Dumping out some follow-up thoughts.

I sometimes think this whole operation might’ve been better served as a micro-blog, but I can’t decide whether that’d be annoying or not.


Reading

I have a lot of heavy lifting on the project front this season, but I’ve got three books I’ve started to meander through. I doubt I’ll release comprehensive notes/excerpts on them, though a post or two might slip out:

Impro sat on the to-read list in my Kindle library for a comically long time. It’s not that big, so I got into it during my last flight. The Long Now summary book should serve as a fun creative aid, something to just sort of slam my way through. The third book is the much-ballyhooed “NASA book about aliens”. It’s a bit of a tome, but the topic was too fun not to dive into.


Autonomous Vehicle Update

Google says we should blame Google when their autonomous cars get ticketed. “Corporate persons accept the legal responsibility for the work algorithms do.”

(I wrote a loose post on autonomous cars last year. The legal/institutional bottlenecks are going to be interesting to watch shake out.)

California will start granting licenses for autonomous cars.


Sensemaking Machines Update

OnlyBoth:

CEO Valdes-Perez describes the self-funded OnlyBoth as taking sort of a reverse approach to IBM’s Watson artificial intelligence technology, which can make sense out of unstructured information, as was on display during its famous Jeopardy! game show performance in 2011. OnlyBoth’s algorithms sort through structured big data (initially on 3,122 U.S. colleges and universities described by 190 attributes) and spit out fun facts and comparisons in the form of perfect English sentences. The company’s motto: “A sentence is worth 1,000 data”.

“We’re about structured data in, unstructured data out,” says Valdes-Perez, a computer scientist and adjunct professor at CMU. “It was a hard problem to solve.”

For example, when I popped “Massachusetts Institute of Technology” (or rather, “MIT”) into the search box on OnlyBoth, the Web-based service turned out this gem: “MIT spends the most on research ($1,128M) among all 3,122 colleges,” plus it shared some comparative info for Johns Hopkins, Stanford and others. You can get up to 25 insights per school, and can sort by topics, such as dorm capacity and Rhodes Scholar alumni. You can also discover “surprising” facts and compare schools to their neighbors and rivals. The app is linked to Facebook, Twitter and other social apps in case you want to share findings with your connections, and Valdes-Perez says he expects this is one of the main ways that people will learn about OnlyBoth.
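The “structured data in, unstructured data out” pattern is easy to sketch at toy scale. OnlyBoth’s actual algorithms and dataset aren’t public, so everything below is invented for illustration (the records, attribute names, and phrasings are all hypothetical), and the real product presumably goes well beyond simple superlatives:

```python
# Toy sketch of "structured data in, sentences out".
# All data and phrasings here are made up; OnlyBoth's real
# algorithms and dataset are not public.

schools = [
    {"name": "MIT", "research_spend_musd": 1128, "dorm_capacity": 3800},
    {"name": "Stanford", "research_spend_musd": 980, "dorm_capacity": 6400},
    {"name": "Johns Hopkins", "research_spend_musd": 1050, "dorm_capacity": 2700},
]

# Map each attribute to a verb phrase and a value format string.
PHRASES = {
    "research_spend_musd": ("spends the most on research", "${:,}M"),
    "dorm_capacity": ("has the largest dorm capacity", "{:,} beds"),
}

def superlative_sentences(records, phrases):
    """For each attribute, find the record that maximizes it and
    render the finding as an English sentence."""
    for attr, (verb_phrase, fmt) in phrases.items():
        best = max(records, key=lambda r: r[attr])
        value = fmt.format(best[attr])
        yield f"{best['name']} {verb_phrase} ({value}) among all {len(records)} colleges."

for sentence in superlative_sentences(schools, PHRASES):
    print(sentence)
```

With the toy data above, this prints lines like “MIT spends the most on research ($1,128M) among all 3 colleges.”, echoing the shape of the example output.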


Elegant Defenses of the Unacceptable

I don’t feel like writing a special-snowflake story about my time in public school.

One thing I wish I had seen in school, though: more elegant, convincing defenses of an unaccepted (or even “now obviously horrid”) idea.

I can easily recall only two categories of exceptions to the Symphony of Success that was my history/science curriculum: a discredited idea appeared either as a rung in Wittgenstein’s ladder, or as an un-serious historical setup (glossing the Ptolemaic astronomical model to explain “what people thought” before the heliocentric model).

I can see why we wouldn’t spend class time treating discredited ideas earnestly, but that kind of exposure might have saved me a lot of time and effort. It might have afforded a view of history where intelligence is not necessarily related to “wrongness”, and it might have served as a stark reminder that the world is terribly complicated, and that many obvious things are not obvious.

Until relatively recently in my life, I never considered seeking out honest, erudite defenses of the Bad Guys as they themselves argued them (as opposed to the somewhat common urge to revise the Bad Guys’ argument into something palatable now through obvious omission, e.g. “The South did not secede because of slavery”).


Deconstruction

A great guide from Nick Szabo, whose writings I’ve been exploring periodically:

The popular epithet “deconstruction” comes from hermeneutics. “Dekonstruction”, as originated by Heidegger, did not, contrary to its current popular usage, mean “destructive criticism”. The term was popularized by Derrida, but in a context where it was accompanied by destructive criticism. Heidegger was very interested in reading philosophy in the original Greek, and noticed that translators tended to add their own interpretations as they translate. These interpretations accumulate as “constructions”, and a doctrine, whether translated or reinterpreted in some other manner (for example, a law reinterpreted by a judge), accumulates these constructions over time, becoming a new doctrine. Heidegger, desiring to unearth the original Greek thinkers, set about to remove such constructions.

Deconstruction in its “postmodern” construction is usually applied to ferret out a bias one wants to remove, and has tended to get mixed up in the literature alongside criticism of those biases. So guess what, deconstruction has acquired a new interpretation, a new construction, “destructive criticism”. But deconstruction in its original sense is not a criticism at all, it is simply a theory about how traditions evolve, namely via the accumulation of constructions, along with a methodology for ferreting out constructions that have for some other reason been deemed to be undesired.

Of course, the above analysis is itself a deconstruction of the term “deconstruction”.

To continue our reflexive deconstruction, and thereby learn some more about its method and use, Heidegger was in turn inspired by earlier hermeneutics, in particular the Reformation Biblical translators like Luther who, in our postmodern parlance, were trying to deconstruct the Catholic Church’s interpretations to get back to the supposedly inspired original text. Removing Roman doctrines such as tithes, indulgences, and spiritual loyalty to Rome had economically and politically beneficial effects to un-Romanized Europe[6] so there was quite a motivation for this seemingly obscure task. Of course, modern scholars have deconstructed further and found that there was no “original” text but an evolution of texts from the Essenes, the Dead Sea Scrolls, St. Paul, then (finally) the Gospels.

I actually read this one a couple of weeks ago; I was reminded of the topic by Andrew Sullivan this weekend, in this post:

From ages 17 to 23, Jessica Misener was a born-again Christian. And then she went to graduate school at Yale, learned a bit of Hebrew and Greek, delved into studying Scripture, and eventually lost her faith, which “hinged almost solely on believing the Bible to be the literal, inspired word of God”:

[…] This is all the more curious given that, by her own admission, the evangelical position she once held is something of a modern invention, and that most Christian traditions outside of evangelicalism, such as the Roman Catholic, Anglican, and Orthodox ones, to say nothing of more liberal strains of Protestantism, hold different and often more nuanced and complex understandings of the Bible. In fact, out of fairness to my evangelical friends, I’d even say that within conservative evangelical theological circles you can find approaches to the Bible that uphold inerrancy without reducing it to a simplistic literalism. Misener doesn’t seem to show any interest in any of these alternatives.