Every day I struggle with this nagging feeling that I should be writing something. It's ridiculous. I have more text editors and word processors than I have cameras (and I have many cameras). At least I take pictures every day; though I can't really tell you why I do that either. I read about writing. I think about how to connect my iOS devices to my Tinderbox file, thinking that my "problem" is that I'm simply not at my computer enough.
But that's not really it either.
There are two things, I think, that have kind of kept me away from the keyboard. Neither is a "reason," per se, and certainly not an excuse. They're more like a barrier I need to overcome somehow, if I want to satisfy this urge. Consider this a step in that direction.
The first thing is, why bother? With photographs, it's almost as if I can't help it. Even if I deliberately leave the condo without a camera, something I seldom do, I still have my phone with me and, without even wanting to, I'll see something and I'll take a picture of it. I'm keeping the cloud service and mass storage device companies in business, I think. Anyway, it seems to give me pleasure.
But with blogging, it seems like it's kind of pointless. For some people, it's how they hope to make a living. So their efforts are directed toward capturing attention somehow, and that keeps them focused and motivated. For me, just another guy with an opinion, the question is, "Does the world really need to know what Dave Rogers thinks?"
The answer is clearly, "No."
Back during the run-up to the Iraq War, I was blogging a lot. I still have all those posts in a Tinderbox file. They were hosted on Apple's homepage.mac.com service, which is just a memory now. Those bastards. I digress. But during the run-up to the Iraq War, I was extremely motivated to write. It was like watching a slow-motion train wreck and being utterly powerless to stop it.
In many ways, it was perhaps the golden age of blogging. So many people were arguing for or against the notion of invading Iraq. Of course, the Bush Administration wasn't listening to anyone but themselves and the folks who agreed with them, and Congress was seemingly afraid to appear wimpy or something. So it was mostly just bloggers arguing amongst ourselves, sorting ourselves into camps we hadn't known we belonged to.
It was exhausting in a way. After the invasion, it seemed pointless to go on commenting about it, and instead I just hoped for the best. We all know how it turned out, but "I told you so" isn't exactly deathless prose.
I'd blogged pretty regularly before the war, of course. Partly because of the novelty, I could self-publish online and it was pretty exciting in the way that an Apple ][+ first allowed me to put text on a TV screen. It was something many of us never really knew we could do before. I met a bunch of nice people through blogging, and still keep in touch with many of them. There was a natural give-and-take, pointing to each other's posts and commenting about this and that on a near daily basis.
I had a fairly long-running online debate with Doc Searls regarding the notion that "markets are conversations." I disagreed strongly, arguing that the social and the commercial are orthogonal to one another, their intersection being a one-dimensional line; that efforts to see a greater overlap in the two merely lead to the commercialization of our social lives. Which is where we are today, with "social media." We're all curating our rockstar (or homespun, or vegan, or athletic, or nerd, or pick your lifestyle) lives for the benefit of advertisers. Yay.
So I think I can safely say I've never actually changed anyone's mind about anything through blogging. At least, I couldn't stop the Iraq War, I couldn't stop the commercialization (or monetization, same thing) of our social lives, and it's pretty clear I haven't stopped Google from basically destroying the idea of privacy, or persuaded anyone that large unaccountable corporations with access to vast quantities of information about individuals are a disaster in progress.
But another reason why I wrote with some frequency back then was because I was going through a difficult period in my life. I was separated and eventually divorced. I was seeing a therapist and learning a great deal about myself and life in general. Writing was a way of processing all that.
I was uncomfortable.
Today, I'm very comfortable. I'm retired. I live on a pension that is adequate for my needs, so far. I have friends in my life. I have a significant other (and, oh my, is that an artless term, but "girlfriend" seems so silly when you're approaching 60). I have a good dog. I'm quite comfortable. There is no angst compelling me to perform public acts of self-examination.
I'm comfortable, the novelty has worn off, and I clearly have no means to influence the course of events of the day.
But I still like writing.
These days, I mostly write a little update to the members of my condo association every few weeks. Our attorney proofreads it. "Vanilla" being the preferred voice. I'm also editing a novel my friend wrote. (Please don't feel sorry for him. It's easier to edit another writer's work than to correct one's own.)
But I like to write. I like seeing a thought in my head appear before my eyes on this stupid screen. It still appeals to me for some reason.
So, maybe the world doesn't really care what Dave Rogers thinks. Maybe it's only important for Dave Rogers to learn what Dave Rogers thinks, and maybe the best way to do that is to write those thoughts down and find out what they are.
Well, it'll have to do.
Now more social! Added a "follow me" Twitter link to the sidebar, and a link to my Flickr page.
Now to see if it worked...
Thoughts Too Long to Tweet
I was thinking about control systems as I was walking this morning. We design systems for a particular output from a given set of inputs. In photography, we have a system that tries to measure the "color" or "temperature" of the prevailing light, and this is called "white balance." If the camera doesn't correctly identify the color of the light, the photograph it produces will have a color cast that doesn't match what we saw with our eyes. Our brain does this automatically; the imaging system must try to mimic what the brain does.
Most of the time, most of us use "automatic" white balance, and the camera usually does a pretty good job of correctly identifying the color of the light, unless you're indoors in a very mixed lighting (incandescent, LED and fluorescent) environment. As you get more experienced in photography (photographers, don't get upset! I'm talking about people who are just beginning their exploration into photography), you might use "scene modes." You find that the "Sunset" mode yields better sunrises and sunsets, with deeper reds and a more dramatic rendering of the scene than the image from the automatic mode. If your curiosity leads you to look at a photo's exif data to see what's different between a "Sunset" mode image and an "Automatic" mode image, you'd discover that the "Sunset" mode has set the white balance to "Cloudy" and applied -0.7 EV of exposure compensation (effectively reducing the overall exposure, making the scene darker).
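For anyone curious what -0.7 EV actually does to the exposure, here's a minimal sketch of the arithmetic. EV compensation is measured in "stops," where each stop halves or doubles the light, so the relative exposure is just a power of two (the function name is mine, not anything from a camera's firmware):

```python
def exposure_factor(ev: float) -> float:
    """Relative exposure for a given EV compensation.

    Each EV stop doubles (+1) or halves (-1) the light recorded,
    so the multiplier is simply 2 raised to the compensation value.
    """
    return 2.0 ** ev

# The "Sunset" scene mode's -0.7 EV setting records roughly 62%
# of the light the automatic exposure would have used.
print(round(exposure_factor(-0.7), 3))  # about 0.616
print(exposure_factor(0.0))             # 1.0, no compensation
print(exposure_factor(1.0))             # 2.0, one stop brighter
```

So "a bit darker" is really a ~38% cut in recorded light, which is why those sunset reds come out deeper and more saturated.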
So, to get the dramatic image, we "lie" to the system, we deliberately input errors to achieve an output the system isn't designed to produce "automatically" for the given set of inputs.
It's important to keep in mind that this is a fairly "static" system. Sure, the light's changing rapidly, but you're grabbing one set of data inputs, and you usually want only one output. It's not "dynamic" in the sense that you're constantly changing the white balance or exposure compensation throughout a certain sampling time. It's a relatively discrete event, and there's little inherent feedback within the system itself, excluding the photographer.
Perhaps a somewhat more dynamic example might be when an electric guitarist uses amplifier feedback to create some kind of desired artistic effect. It takes some skill, because sometimes the results aren't exactly artistic. Is the artist "in control," or is he not an "artist"?
What prompted this line of thought was the idea that Fox News has been deliberately introducing erroneous data into the media system to achieve a particular output, which the media system previously hadn't been yielding. Now, that's not to say the media system prior to Fox News was perfect or ideal in any way, just that Fox News treated "information" in a vastly different way than the previous regime had.
Which relates to this whole Donald Trump phenomenon, which I am so tired of seeing in my Twitter feed.
Fox News has been deliberately introducing errors into the media system to obtain a desired output. Presumably, Fox News believed it was "in control" of this system.
Control is largely an illusion.
At best, there are systems which are designed to exhibit a certain, fairly narrow, predictable range of behaviors within a defined set of limits - if the system is "in control." (It's not, really; we're just saying that if the outputs are within a desired, defined set of limits, we call that "control.") In a broader perspective, this is also part of the "phase space" of the system in chaos theory. A phase space for a given system may naturally exceed the "desired" control limits, so when a system is "out of control" from our perspective, it's still within its natural, or inherent, phase space.
When you lose control of your car, you could end up in the ditch, or in a field, or hitting another car, but you won't end up on the moon. The car was "out of control," but there were a limited (though still very large) number of options where that event could end. ("Ending" itself being a somewhat arbitrary distinction, causality and chain of events being what they are.)
Fox News has been pumping bad information, "errors," into a certain segment of the social system (Fox News viewers) for decades now. This segment has low information, and they've been conditioned to mistrust alternative mainstream sources. Now a candidate comes along who essentially takes advantage of a low information electorate with a campaign and a narrative that is relatively consistent with the Fox News narrative, but is willing to exploit the perturbed nature of the system (low information, high mistrust) to achieve aims far outside the control limits of the establishment Republican Party.
So, Donald Trump and his electorate are "out of control."
An interesting question would be, "What are the contours of the phase space of a Donald Trump presidency?"
A shorter take is, "as you sow, so shall ye reap," or "garbage in = garbage out."
Inevitably, if there is any hope of preserving any sort of "system," with desired outputs, one has to take out the garbage.
Playing to Type
You have to wonder if scientists and researchers ever read or watched any science fiction as kids. So many have said that SF got them interested in science, but then how do you explain utterances like this one:
These are tools that we designed that we can control. We should explicitly be designing these systems such that we are able to control them and where we fear there’s a risk that we’re not able to control them, then that’s I think when we should be slowing down, just as we have in many other sectors, from nuclear development to chemical weapons or the like.
This is from Mustafa Suleyman, cofounder of a company called DeepMind, owned by (surprise!) Google, in an article on techworld. It's about Google's efforts in developing Artificial Intelligence.
It's the classic science fiction plot, all the way back to Frankenstein fer chrissakes! By the time you begin to "fear there's a risk," it's already out of control! Because for every moment prior to that, your desire tells you that you can control it, because you must believe that to continue doing the dangerous thing you want to be doing! You don't experience fear until you realize it's out of control!
And if Mr. Suleyman believes nuclear development and chemical weapons are "under control," well, he's delusional. We spend an awful lot of time and money working around the error of nuclear development and chemical weapons. One American political party would even have you believe that we went to war and killed hundreds of thousands of people because they weren't "under control." They are decidedly not. We just try to manage the risk, and some day we'll fail. And, oh by the way, they're not intelligent agents who may interpret efforts at control as something antagonistic to their programming - damage, to be routed around, so to speak.
I just think it's astonishing how foolish we are to believe we can control anything. Most of us can't even control ourselves. The best we can do is develop error-tolerant systems that are designed to mitigate loss-of-control situations, like air bags, ever so useful in an era of cell phones and Facebook Messenger. I suppose the Republicans once believed they could control Donald Trump. Hah!
Anyway, I'm not really afraid of Artificial Intelligence. If we ever manage to create it, and I suspect we will, we may catch a break and it'll be benign and help us correct many of our mistakes. My guess is, if it is benign, we'll resent it and go to war against it anyway, because that's just the perverse aspect of human nature. If it's hostile, well, it's just game on from the beginning.
Either way, it's probably a good thing to develop AI. We're probably going to render the planet uninhabitable due to greenhouse gasses, and humanity will go the way of the dinosaurs. At least that way there'll be some consciousness in the universe that knew us once, and remembers that we existed.