Subjective AI Results.

Banality. I don’t often use the word, and I don’t often encounter it, largely because ‘unoriginal’ seems to work better for me. That said, one of the things I encountered while playing with my new toy, Tumblr, used it effectively and topically:

Project Parakeet: On the Banality of A.I. Writing nailed it, covering the same basic idea I have expressed repeatedly in things I’ve written, such as “It’s All Statistics” and “AI: Standing on the Shoulders of Technology, Seeking Humanity”.

It’s heartening to know others are independently observing the same things, though I do admit I found the prose a bit more flowery than my own style:

“…What Chatbots do is scrape the Web, the library of texts already written, and learn from it how to add to the collection, which causes them to start scraping their own work in ever enlarging quantities, along with the texts produced by future humans. Both sets of documents will then degenerate. For as the adoption of AI relieves people of their verbal and mental powers and pushes them toward an echoing conformity, much as the mass adoption of map apps have abolished their senses of direction, the human writings from which the AI draws will decline in originality and quality along, ad infinitum, with their derivatives. Enmeshed, dependent, mutually enslaved, machine and man will unite their special weaknesses – lack of feeling and lack of sense – and spawn a thing of perfect lunacy, like the child of a psychopath and an idiot…”

Walter Kirn, ‘Project Parakeet: On the Banality of A.I. Writing’, Unbound, March 18th, 2023.

Yes. Walter Kirn’s writing had me re-assessing my own opinion not because I believe he’s wrong, but because I believe we are right. This morning I found it led to at least one other important question.

Who Does Banality Appeal To?

You see, the problem here is that banality is subjective: what is original for one person is not original for another. I have seen people look shocked when I discovered something they already knew and expressed glee. It wasn’t original for them; it was original for me. By the same token, I have written and said things that I believed were mundane, only to have others think they were profound.

Banality – lack of originality – is subjective.

So why would people be so enthralled with the output of these large language models (LLMs), failing a societal mirror test? Maybe because the writing that comes out of them is better than their own. It’s like Grammarly on steroids, and Grammarly doesn’t make you a better writer; it just makes you look like a better writer. It’s like being dishonest on your dating profile.

When I prompted different LLMs about whether the quality of education was declining, the responses were non-committal and evasive, some more flowery than others. I’d love to see an LLM say, “Well shit. I don’t know anything about that”, but instead we get what they predict we want to see. It’s like asking someone a technical question during an interview that they don’t have the answer to, and they just shoot a shotgun blast of verbiage at you, a violent emetic eruption of knowledge that doesn’t answer the question.

“I don’t know”, in my mind, is a perfectly legitimate response and tells me a lot more than having to weed through someone’s verbal or written vomit to see if they even have a clue. I’m the person who says, “I don’t know”, and if it’s interesting enough to me for whatever reason, the unspoken is, “I’ll find out”.

The LLM’s can’t find out. They’re waiting to be fed by their keepers, and their keepers have some pretty big blind spots because we, as human beings, have a lot more questions than answers. We can hide behind what we do know, but it’s what we don’t know that gives us the questions.

I’ve probably read about 10,000 books in my lifetime, give or take, at the age of 51. This is largely because I am of Generation X, and we didn’t have the flat screens children of later generations have had. Therefore, my measure of banality, if there could be such a measure, would be higher than that of people who have read less – and that’s just books. There are websites, all manner of writing on social media, the blogs I check out, etc., and those have become more refined because I have a low tolerance for banality and mediocrity.

Meanwhile, many aspire to see things as banal and mediocre. This is not elitism. You see it when a child learns something new and expresses joy, and an adult looks on in wonder, wishing they could experience that originality again. We never get to go back, but we get to visit with children.

Going to bookstores used to be a true pleasure for me, but now when I look at the shelves I see less and less that is new, the rest a bunch of banality with nice covers. Yet books continue to sell because people don’t see that banality. My threshold for originality is higher, and in a way it’s a curse.

The Unexpected Twist

In the end, if people actually read what these things spit out, the threshold for originality should increase, since after the honeymoon period with their LLM of choice is over, they’ll recognize the banality.

In a way, maybe it’s like watching children figure things out on their own. Some things cannot be taught, they have to be learned. Maybe the world needs this so that it can appreciate more of the true originality out there.

I’m uncertain. It’s a ray of hope in a world where marketers would have us believe in a utopian future that they have never fulfilled while dystopia creeps in quietly through the back door.

We can hope, or we can wring our hands, but one thing is certain:

We’re not putting it back in the box.

Feynman’s Flower

A recurring conundrum I have had in life is the way the world demands I be technological/scientific or creative. It does not allow for both, neatly filing people into categories that can define them for life.

At least in my lifetime, it has seemed that way. At least, in my life, it has seemed that way.

If you’re creative, you’re considered irrational and emotional, lacking objectivity, by those in the scientific/technological camp.

If you’re scientific/technological, you’re considered rational, distant, cold and inhumane to creative people.

The people I’ve enjoyed the most in my life have been both.

It’s not a surprise, then, that I found someone had taken Richard Feynman’s conversation about flowers and made a video that suited it perfectly – one that, to me, captured something I always explained to my mother, the creative, but couldn’t quite get through to my father, the more engineering-minded.

Enjoy it below.

Who Are We? Where Are We Headed?

We used to simply dangle from the DNA of our ancestors, then we ended up in groups, civilizations, and now that we have thoroughly infested the planet we keep running into each other and the results are so unpleasant that at least some people are renting a virtual, artificial girlfriend for $1/minute.

It’s hard not to get a little existential about the human race with all that’s going on these days with technology, the global economy, wars, and where people are focusing their attention. They’re not really separate things. They’re all connected in some weird way, just like most of humanity.

They are connected in logical ways, we like to think, but when you get large groups of people logic has an odd tendency to make way for rationalization. There are pulls and tugs on the rug under the group dynamics, eventually shaking some people free of it for better or worse.

This whole ‘artificial intelligence’ thing has certainly escalated technology. The present red dots in this regard are about just how much the world will be improved by it. We’ve heard that before, and you would think that with technology now reflecting our own societies more clearly through large language models, we might be more aware that we’ve all heard these promises before.

I can promise you that for the foreseeable future, despite technological advances, babies will continue being born naked. They will come into the world distinctly unhappy with having to leave a warm and fluid space to a colder, less fluid space. From there, they seem to be having less and less time before some form of glowing flat screen is made available to them, replete with things marketed toward them.

It would be foolish to think that the people marketing stuff on those flat screens are all altruistic and mean the best for the children as individuals and for humanity. They’re trying to make money. Everyone’s trying to make money.

I don’t know whether this is empirically true, but it seems to me that when I was a child, people were more interested in creating value than making money. If they created value, they got paid so that they could continue creating value. It seems, at least to me, that we’ve been pretty good about removing value from the equation of life.

This is not to say I’m right. Maybe values have changed. Maybe I’m an increasingly dusty antique that every now and then shouts, “Get off my lawn!”. I don’t think I’m wrong, though, because I do encounter people of younger generations who are more interested in value than money, but when society makes money more important than value, then everything becomes about money and we lose… value.

To compensate, marketing tells people what they should be valuing to be the person that they are marketed to become.

I don’t know where this is going, but I think we need to switch drivers.

Maybe we should figure out who we are and where we want to go. Without advertising.

Earth Bound Misfits.

I may offend some folks with this, but it’s hard to write anything these days without doing so. My intent is not to offend but to present my perspective.

I had to explain to someone that there is a difference between anti-theism and atheism itself. It’s tiresome.

Anti-theism is akin to the political far right of atheism, finding all sorts of things to blame theism for, with convincing and rational arguments that just don’t work on the religious.

Conversely, the religious arguments against atheism tend to be of a religious nature and only cherry-pick logic.

As an atheist, I don’t care enough about religion to debate it, and I understand that for at least some people it’s a source of enjoyment. Can it be misused? Certainly, but so can just about everything else humans do. Take a look around. Once the religious stuff doesn’t adversely impact my life, I don’t care too much. A public holiday is a public holiday.

People talking about being nice to each other is something we could use more of, and if some people require religion for that, that’s fine, but withholding aid to people in a disaster zone unless they join your specific religion is just shitty salesmanship. Bad things happen with anything people are involved in and religion is no different. The same with science and technology.

The arguments on both sides tend to center around everyone wanting people to be nice to each other and fighting over how it should be done.

Now, atheism is more complicated because everyone has their own personal version. In fact, atheism is a complete lack of belief and the only reason there is a name for people who don’t believe in a deity is because the people who need deities needed something to call the others who didn’t. Personally, I prefer being called a heretic. It conjures in my mind some guy living in a cave somewhere with a rabbit bone in his beard, dancing around a fire just for the joy of it.

My version of atheism is just being awed by the world around me without the need to blame anyone or anything for it. When you drill down into the finest details of everything we know about everything around us on the planet, from plants to fungi to the thinking meatbags we think we are, it’s simply astounding that it all exists. This is the platform from which many people’s faith springs, and I applaud them on that. I, however, understand – even believe – that we’re just an accident in a universe of accidents.

We would like to know why we’re here, but it’s peculiar that it’s not the first question a baby asks. We’re sensory creatures, social creatures, and our first questions are not about the universe but about what’s practical for us. No child says, “I’d like to know about all this God business”; instead, they are sort of put on a path based mainly on geography, for better and worse. I was thrown into various religions as a child and came away fairly unscathed, with no ill feelings toward the religions – but a little scared by the bad things people who are religious will do in the name of their deity of choice.

Me? I have come to an understanding with the universe which is as lopsided as everyone else’s. When I view the world, I view it as a tourist because, in the end, that’s what I am.

All the beauty, all the ugly, everything combined showing just how complicated a single world is, with we earth bound misfits constantly stretching the bounds of our knowledge through science and technology. To date, no one I know has prayed and received a better algorithm for anything, but it might happen. Who knows?

But please, don’t try to tell me that science is evil while using an app on a cellphone whose signal is bouncing around the Earth through satellites, powered by harnessed electrons finding their way to ground through our mazes, as you type on a device containing some of Earth’s rarest elements.

Incoming: The Tide of Marketing.

Browsing Facebook, I come across this in my feed and it’s as if they read what I wrote in Silent Bias:

…With social media companies, we have seen the effect of the social media echo chambers as groups become more and more isolated despite being more and more connected, aggregating to make it easier to sell advertising to. This is not to demonize them, many bloggers were doing it before them, and before bloggers there was the media, and before then as well. It might be amusing if we found out that cave paintings were actually advertising for someone’s spears or some hunting consulting service, or it might be depressing…

Almost on command, this shows up in the main feed on Facebook – sponsored content by Google. I haven’t used Bard, but I fear I have suffered Bard’s work because… I imagine that they used Bard to generate that advertising campaign for Bard.

The first thing that every sustainable technology has to do is pay for itself. The magnitude of this, though, is well beyond cave drawings. As it is, marketing has used a lot of psychology to get people to chase red dots.  Now that this has become that much ‘easier’ for humans, and now that it’s being marketed as a marketing tool…

How much crap do you not need? We need to be prepared for the coming tide of marketing bullshit.

The Tyranny of Stuff.

As I mentioned in this post, I’m fiddling around with this world and have been researching a lot of what we know about Terra Conferti, or what some of us call Earth, and in particular, the history of the self-proclaimed dominant species on the planet, which we can shorten to ‘us’.

When there were significantly fewer of us, we wandered the planet. It wasn’t necessarily a great life, but we migrated to where we could find food and shelter, and when those things weren’t as good as somewhere else, we wandered off.

Nowadays, we pretend to deal with this wanderlust by going to hotels in other countries which, generally, are like hotels in any other country with some distinctive and sterilized things. Experiencing the way real people live in a country isn’t really in the offing except, perhaps, some eco-tourism.

Before the Agricultural Revolution, we wandered around, found food, had sex and probably got rained on a lot. The less lucky ones got snowed on. Everyone adapted to their general areas and environments and settled into traditional migration patterns, just like many other creatures. The Agricultural Revolution, though, meant large populations could be supported, and with those larger populations, we got to do nifty things like find places for stuff that we could have without carrying it around.

Some of that stuff allowed us to share generational knowledge, like twig technology. Some of it, maybe even most of it, is just useless stuff that we pay rent for with space that we pay for, one way or the other.

Where once we only carried what we needed to survive, we cling to things that we want. I’m not sure how much of that is progress, but it bears some scrutiny for any sentient species.


When I woke up, it had the feel of a Saturday.

Not that Saturday after a hard week of work, when you come home to deal with responsibilities like chores, or with people you don’t like. That reminder of a drip here, a crack there, a place where there should be a shelf, a door creaking… things that need to be addressed which no one else would do. The toil of a Saturday.

It was like that childhood Saturday when you looked forward maybe to Saturday Morning Cartoons, and going outside for the entire day without adult supervision. That childhood Saturday evading adults to explore a friend’s tree-house, do reckless things on bicycles, catch insects, fish dirty magazines out of sewers, or play with that box of matches. A Saturday rife with experiences and glorious exploration, of risk being the reward.

And then I looked at my watch and it decided, this gift of digital technology meshed with software, that it was Thursday.

Tech isn’t all the marketing brochure said it would be in the 1980s.

Short Rant on Digital Cameras.

Trinidad and Tobago Yacht Club at Dusk.

Once upon a time, I was enthused about photography, enough so that I got a Sony Alpha 6000 when they first came out. I’m not horrible at photography; I consider myself ‘semi-professional’ since I have been paid for some photos I have taken. Others I permitted to be used in academic publications at no cost.

While I was out with friends a few weekends ago, I encountered a professional photographer, someone whose work I know and respect. We talked a bit about cameras, and then I griped about how I bought a Sony Alpha 6000, and the very next year the upgraded camera used different lenses – and I just never bothered after that. It’s true. And for a while I beat the snot out of that Alpha. It was fun, but once that camera got orphaned for lenses, a sense of “why bother” permeated the photographic aspect of my soul.

Imagine my surprise when I was agreed with.

I took the image at the top of this post with a Samsung S22 Ultra, demonstrating its ‘nightography’ ability, which seems to be multi-lens HDR applied to the low-lit aspects of the image. It’s nifty, cool and… relatively inexpensive on something I have to upgrade every few years anyway. A phone.
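Since I’m guessing at how these night modes work, here’s a rough sketch of the general idea as I understand it – exposure fusion, where several aligned frames are blended per pixel according to how well-exposed each pixel is. This is an illustration of the technique, not Samsung’s actual ‘nightography’ pipeline; the function name and weighting scheme are my own inventions:

```python
import numpy as np

def fuse_exposures(frames, sigma=0.2):
    """Blend aligned exposure-bracketed frames (values in [0, 1]).

    Each pixel is weighted by 'well-exposedness': values near 0.5 get
    high weight, while crushed shadows and blown highlights get low
    weight. A simplified take on Mertens-style exposure fusion.
    """
    stack = np.stack(frames).astype(np.float64)            # shape (n, h, w)
    weights = np.exp(-((stack - 0.5) ** 2) / (2 * sigma ** 2))
    weights /= weights.sum(axis=0, keepdims=True)          # normalize per pixel
    return (weights * stack).sum(axis=0)                   # weighted blend

# A dark frame and a bright frame of the same flat scene.
dark = np.full((4, 4), 0.1)
bright = np.full((4, 4), 0.9)
fused = fuse_exposures([dark, bright])                     # lands near 0.5
```

Real pipelines add frame alignment, denoising, and multi-scale blending on top of this, but the per-pixel weighting is the core of why a phone can pull a usable image out of dusk light.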

It seemed intuitive that standalone cameras would have decreased in cost while increasing in value, yet when I did a quick perusal, I was quickly disappointed. The cameras I would consider buying are over $2,000. With how quickly my camera and lenses found their way to antiquity, I simply don’t see spending that much to be relegated to a museum that quickly. It’s ridiculous for someone who is semi-professional, if one gauges that the camera should pay for itself and pay toward another before it becomes outdated. It’s not worth it. Using two sticks to start a fire with your wallet seems much more understandable.

So, for now, I’m done with standalone cameras, in much the same way I stopped getting cable television with my internet access, or stopped having a landline when I got a mobile phone. I just don’t see it unless you’re in a position that pays to have the cameras replaced that quickly and expensively.

The way things are looking, that’s not happening soon.

Technology Is Boring

Originally begun September 28th, 2019, and pushed out as I clean out drafts.

There are only so many times you can write about the same things.

In the past year I’ve spent a lot of time thinking, reading and writing (and throwing that writing away), considering my own writing, looking over old pieces I wrote for magazines, for blogs, for websites, and for odds and ends. There’s a few common threads.

Most of it has been about technology, which is appropriate because my life has been supersaturated with technology – so much so that the cliché of children teaching their elders about technology does not apply to me. I have yet to have a niece or nephew show me how to do something; instead, I end up having to show things to them, including on the nefarious mobile devices. This is because, at its core, technology has not changed.

With technology, we’ve simply managed to change mediocrity a bit. Some call it progress, fewer call it moving backward (there are no marketing departments trying to tell you technology sucks because everyone is selling you technology). The reality is that there have been changes, but those changes are… mediocre.

Technology is boring, and yet, there are aspects we must pay attention to in the information age, from technocolonialism to data sovereignty, from censorship to privacy.

To the M1: Initial Thoughts.

First, let me give some context: I have an Intel NUC 7 that has done well over the years at annoying the hell out of me. Between Windows 10 and the NUC itself, getting it to detect audio over HDMI has been something I could never get right. It would get it right 1% of the time, and 99% of the time I had to rely on a Bluetooth speaker.

I updated the firmware, manually, because the automatic firmware updates didn’t work. I updated device drivers. I updated Windows 10. I did all these things with decreasing alacrity over the past 2 years. Booting took forever as, apparently, Windows and Intel could not quite figure out how to play well together.

I will tell anyone considering buying an Intel NUC of any sort to consider, perhaps, randomly giving yourself paper cuts across your knuckles throughout the day. It’s cheaper.

I shopped around. Ever since I did some work with a telecommunications company in Florida where Mac minis were all over their dev environment, I kept an eye on them and lo! The M1 Mac mini showed up with stuff related to neural networks. I like neural networks. They’re smarter than dumb people, generally, and what I have learned is elevating the level of stupid I deal with daily is all I can do in this world.

NUC replacement? No. I’ll likely toss Linux on that thing where it can quietly do… something… without annoying me too much. I honestly have had such an annoying experience with that NUC that I will only sell it to someone I don’t know or like, should I decide to get rid of it.

These things, of course, never show up in reviews of devices because it takes time to truly find the annoyances of any device, and you should bear this in mind as I write about the M1 – but truth be told, it has already begun to annoy me a bit.

The Mac Mini M1.

And so, I unpackaged the Mac mini M1 I had custom ordered (more RAM, more SSD, etc, because: because). I note a lot of reviewers like to talk about the packaging, and I have no idea why when it takes 5 words to communicate that: “It was hard/easy to unpack”.

It was easy to unpack.

Thoughts of someone in a sweatshop packing everything just so drifted through my mind, but I did not afford myself the luxury of that thought for too long, since the amount of packaging was minimal.

Next, not being an avid Apple user, I tried to turn the thing on, which, of course, required me to read the one-page documentation that came with it, since in my experience the ‘on’ button was on Mac keyboards. It wasn’t there. That took me all of two minutes.

It came on. It recognised the mouse and keyboard, and began the ‘new computer’ interrogation:

Where are you from?
What is your Apple User Id (or whatever the hell they call it)?
Do you want to… use this? That? The other? All 3? Just 2? Which 2?

I think the most amusing thing was that it asked me about Siri, so I set that up thinking, “Hey, did this thing come with an internal microphone?” Well, of course it didn’t, so the de facto world leaders in User Experience (UX) made a boo boo. It should have said, “Oh darn, you don’t have a microphone, no Siri for you”, instead of having me shout “Siri” at the Mac repeatedly.

And then, suddenly, we’re doing updates, which are always annoying (who wants to start a new machine with updates? Nobody.), but they were relatively painless. It offered the latest macOS version, Monterey, but I’m sticking with Big Sur for a bit.

And then it just… Worked. And that’s what people want. We want stuff that works. It recognised my monitor and the audio – though for some peculiar reason it didn’t save the monitor as the default audio output, something I’ll figure out in time.

So, out of the box – I like the M1.

Then comes the wonderful part of passwords from accounts, etc, which is always a hassle, but that’s pretty much done – and then, there is the adjusting of using the keyboard from the PC keyboards to the Apple keyboards.

Ctrl? No, Command, which is a key over and takes some getting used to.

And lastly, the part that’s horrid about the M1: some applications just ain’t ready for it. Android emulators do work, but not as automatically as one would hope. Some games are hokey. The M1, having been out for about a year, hasn’t grown the support outside of Apple that would make it a real contender out of the box, and I imagine the market share isn’t something that has software companies racing to compete for it.

So on the software end, it’s a mixed bag – and fortunately, I have basic needs of the machine that are met which make the inconvenience bearable.

It is quiet. Creepy quiet. The kind of silence that, coming from a three-year-old human, makes you wonder what they’re into.

Overall: it does what I need it to do, and it inconveniently does not (yet?) do some things that I want it to.