Questionable things

It’s November 2019, which, as all sci-fi fans and film buffs know, is the month when the events of Blade Runner take place.

We wrote about Ridley Scott’s dystopian masterpiece back in the early days of this blog, when 2019 still seemed like the semi-distant future. While I did have an inkling that the decade to come was going to be a bit grim, the way things have turned out in reality makes Rick Deckard’s neo-noir Los Angeles look quite attractive in comparison, despite the perpetual rain and the homicidal robots.

Interestingly, Do Androids Dream of Electric Sheep?, the 1968 Philip K. Dick novel that provided the source material for Blade Runner, is in many ways a more accurate portrait of 21st-century life. It envisages San Francisco in 2021 (or 1992, in earlier editions); the elite have long since fled to off-world colonies, leaving ordinary citizens struggling to survive in a world overtaken by ecological catastrophe and drowning in the detritus of a collapsing civilisation, their lives ruled by unaccountable corporations in a brutal police state, finding solace only in technological simulation of lost nature, and bogus virtual-reality religion.

The book and the movie do share a common theme about the nature of humanity, but the former is significantly darker, and much more downbeat in its conclusion. Dick died shortly before the film came out, but he did see a pre-release version, and apparently liked it, though he felt it complemented his story rather than directly reproducing it.

While android technology may not have advanced as far as Dick imagined, the cleverness of today’s Artificial Intelligence does seem to exceed that displayed by the replicants in the story. Roy Batty may trick his way into Tyrell’s residence with an unexpected chess move (though he’s actually just reproducing a game played out by humans back in 1851), but chess is old hat for modern AI; just last month it was reported that Google’s DeepMind program had mastered that most advanced of intellectual pursuits, the online real-time strategy game.

Some people warn that AI is approaching the Singularity: the point at which it can improve itself faster than humans can keep up. This is generally followed, in classic science fiction at least, by the newly conscious supercomputer taking over the world, though this does depend on humans doing something stupid, like handing it control of all the nuclear weapons, and it usually all works out well in the end, once we manage to teach the machines the power of love or something.

I do sometimes worry that AI will kill us all eventually, though not with an army of cyborgs; it will just get us to do the job ourselves, by using social media algorithms to divide us into mutually destructive tribes or, failing that, by convincing enough of us to eschew vaccination that we all die of measles.

At heart though, I’m still enough of a techno-utopian to believe that humankind is sufficiently smart to stay in control of the technology we create, and that our social organisation will evolve to allow the whole population to benefit from the advances that, at the moment, are just enriching a few. All going well, the future will be less like Blade Runner, and more like … actually I can’t think of a sci-fi film where the utopia doesn’t turn out to be a dystopia before the second reel. Maybe Logan’s Run, for the under-30s?

Information overload

Today is apparently the 25th anniversary of the World Wide Web (again), which seems like a good excuse to reflect on the future of our own little contribution to the medium, namely this blog.

One of the many statistics that I’ve seen bandied about today is that, of the billion or so websites in existence, around 75% are inactive. I’m not sure exactly what the definition of “inactive” is in this context, but I’d have to admit that SLS, with our relaxed update schedule, must be at least flirting with it.

Why is this? It’s not as if I’ve become any less opinionated in the last couple of years, and there’s certainly no shortage of subjects to comment upon. If anything, that’s the problem: the sheer deluge of information, handily delivered at all hours of the day, means I never have time to stop and write about anything before I’m distracted by the next item on the timeline. Even when I do pause long enough to start formulating some thoughts on a subject, I tend to be discouraged by the feeling that someone else will undoubtedly have expressed them already, and probably more eloquently, a suspicion that I can usually confirm with a couple of clicks.

Would it really matter if this blog slipped into a permanent limbo? To the world, I guess not, but I would feel more than a twinge of regret. I enjoy reading our old posts, and it would seem like a shame to give up just when we are closing in on our tenth birthday.

So what’s to be done? I need to drop back to a slower pace of news acquisition; perhaps I should start reading actual papers again, instead of addictively clicking on a Facebook feed. I could get rid of my smartphone and attempt to wean myself off the need to be constantly connected. I might even try just hanging out with my friends and talking about stuff, like we used to do back in the old days.

Oh, who am I kidding? I’ve tasted the sweet drug that is the modern internet, and I’m not about to give it up. I’ll just have to try harder not to be so passive…

This “Internet” thing might catch on

It’s a quarter of a century since Tim Berners-Lee submitted his proposal for what became the World Wide Web; I got on board in 1996 via a 14.4k modem, CompuServe, Netscape Navigator and GeoCities, but it wasn’t until May 2007 that the medium finally reached its full potential with the debut of Second Life Shrink. I think it’s fair to say that there have been no significant developments in online culture since then, but we’re working on it…

CD

So here we are at post number 400. Looking back over the six and a half years it has taken us to get this far, I can’t avoid noticing that we have strayed somewhat from the purpose we outlined in our very first post:

My intention is … to wander around the likes of Second Life and report back on what I find, enlightening readers with erudite comments on the interaction that occurs there.

Regular readers will recall that the main reason for our recent lack of SL-related content is that my desktop computer is far too ancient to run the current iteration of the viewer. It’s about 18 months since I resolved to get a new(er) box, but I haven’t got around to it yet, partly because I’m too cheap to buy a brand new machine, and too lazy to order and fit the parts to upgrade my old one, but mostly because I never actually use my desktop these days, as my IT needs are all satisfied by my smartphone, from the comfort of my couch.

I had been waiting for Linden Lab to release an iPhone viewer, but there were no signs that was ever going to happen, so last week I finally lost patience, bought myself a cheap Android tablet, installed the third-party viewer (TPV) Lumiya, and got myself back on the grid:

[Screenshot: Second Life, as rendered by Lumiya]

This setup is less than perfect; although Lumiya does have a fairly decent 3D mode, the draw distance isn’t great, and it tends to slow down alarmingly if there are more than a couple of other people about. It’s hard to go to specific places too, since it isn’t possible to type in coordinates directly; instead one has to acquire and click on an SLURL via the web, which is a bit of a hassle. (Of course I haven’t bothered to RTFM, so there might be an easier way to get around; if anyone knows, please enlighten me.)
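For anyone unfamiliar, an SLURL is just an ordinary web link that encodes a region name and a set of in-world coordinates; if memory serves, the format is something like http://maps.secondlife.com/secondlife/Hippo%20Hollow/128/128/30, where the region name (an invented example here) is followed by x, y and z positions within the region, so getting anywhere specific means finding, or hand-constructing, one of those links first.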

Nevertheless we are, potentially, back in the virtual world business; look out for some SL updates in the weeks ahead, before my attention inevitably wanders…

Wasted Youth

Today (or actually yesterday, since, in true slacker fashion, I haven’t got round to posting this until after midnight) was the 30th anniversary of the launch of the ZX Spectrum. Unsurprisingly, the internet has been awash with articles by 40-something guys fondly recalling long hours spent honing their programming skills on the iconic machine, and pitying later generations, who may have iPads and Twitter and what have you, but missed out on the character-building experience of wrestling with a rubber keyboard to produce the 8-bit classics that founded the video-game industry.

I have to admit that I was one of those sad cases who spent too much time alone in my bedroom typing code, when I should have been out engaging in healthier youthful pursuits, like smoking, drinking, or committing acts of petty vandalism. It doesn’t seem to have done me any harm in the long term though, as long as you don’t count my ongoing tendency to stay up all night blogging about obsolete computers.

Planned obsolescence

I have a somewhat ambivalent relationship with cutting-edge technology; in theory I am in favour of keeping bang up to date, but in practice I find myself hanging on to old gadgets long after they should have been consigned to the recycling bin.

It’s only fairly recently that I got an LCD TV, after spending years squinting at a vintage 14-inch Sony Trinitron, latterly augmented with a digital tuning box (and an RF modulator, since it didn’t have a SCART socket) when they turned off the analogue signal. I would still have it today, but it stopped working, and I couldn’t find anyone willing to even look at it, never mind fix it, though it was probably a simple enough job.

This state of affairs is mostly due to a combination of laziness and stinginess – I’m driving around right now in a car with two broken mirrors and a busted heater, because I resent paying the inflated fee the mechanic would charge me for swapping a couple of parts, but I can’t be bothered going down to the scrapyard to get the bits myself – along with a high tolerance for imperfection; if something isn’t actually going to kill me I can usually put up with it. That’s not the whole story though; despite being avowedly anti-conservative, there is a large part of me that is resistant to change. Jobs, cities, relationships; I’ve stayed in them all long after it would have been sensible to leave. This is probably down to a subconscious fear of death or something; I should perhaps try to work through it in therapy, but I guess it has saved me a lot of money over the years.

All this is a roundabout way of explaining why there hasn’t been much in the way of Second Life content in this blog recently. When I last downloaded an updated version of the viewer (which was a while ago, so it’s not even the latest one), it had the not entirely unpredictable effect of slowing my venerable desktop box to a crawl, making my SL experience even more tiresome than usual. I suppose that I should try using some nimble third-party viewer, but the task of identifying one that is both reliable and Linux-friendly seems like too much of a drag right now, and anyway the Lindens seem to be freezing out the TPV developers, so it would probably only be a temporary fix.

Thus I have reluctantly come to the conclusion that my trusty 12-year-old 1.6 GHz Pentium 4 has reached the end of the line, and that I need a new computer. The simplest solution would be to buy a ready-built machine, but I want to reuse as many components as possible, and the case, keyboard, mouse, monitor, hard drives and optical drive are all perfectly serviceable, so I think I’ll go down the DIY route.

I’ll need a new motherboard, processor, RAM, and a graphics card (I’d keep my not-too-ancient Nvidia GeForce 7-series, but it’s got an AGP connector). I’d really like an Intel Core i7, but they are rather expensive, so I’ll probably settle for a Core i5, which should do me for a few years; processor/motherboard/memory bundles can be had for between £200 and £300. Add in a GeForce 500-series card at about a ton (£100), and that’s a fairly nifty system for under £400.

On the other hand, who uses a desktop computer these days? I could take the money and buy a new iPad, which would do for 90% of my computing needs, pretty much everything except Second Life in fact. I do like to have a big hard drive to keep my data on, since I’m far too paranoid to trust the cloud, but I don’t need a fancy new processor or graphics card for that.

Still, I guess my inertia will keep me from wholeheartedly embracing the new paradigm of mobile computing, and I probably will end up trying to rejuvenate my old desktop. I doubt I’ll get round to it much before the summer though, so this blog will remain misleadingly named until then at least.


Steve Jobs R.I.P.

I’d love to say that my first-ever computer was an Apple II, but it wasn’t, it was a ZX Spectrum; I just fantasised about having an Apple II, which seemed like a properly futuristic machine when I read about it in Omni magazine in the 1980s. I never actually got round to buying an Apple desktop, even when I had the money; at some point I was seduced by the counter-cultural charms of Linux, and have stuck with that ever since.

I do however have an iPhone, and I think I can say without much exaggeration that it has changed my life. I guess that an Android smartphone would have had the same effect, and preserved my open-source purity, but Apple got to me first, and, at this point in my life, I can’t be bothered with the dislocation of changing software ecosystems.

I am planning to get an iPhone 4S, despite the lukewarm reviews, since my current model is a 3G, which is getting to be embarrassingly clunky. It’ll be interesting to see if Apple’s products can retain their cachet now that Steve Jobs is gone, or if people will finally notice that everyone and their granny has an iPhone or an iPod, and Apple are just another producer of (somewhat overpriced) consumer commodities.

Memory Lapse

About a week or so ago I was idly browsing the web when I was suddenly struck by the thought that I should buy some more memory for my desktop computer. I must have been thinking it would speed it up, or let me run Viewer 2.0 properly or something. Anyway, I surfed over to eBay to look for some of the DIMMs that my elderly box needs. Readers may recall that the last time I tried this I went to the auction site after having a few drinks, and ended up buying the wrong kind, so I was careful to only bid on PC133 modules, and managed to pick up 1GB for what seemed like the bargain price of £8, including delivery.

The chips arrived in the post today, so this evening I set to work installing them, which was a lot more bother than I remembered. The memory slots are partially blocked by the video card and obscured by the hard drive cables, so I had to take the machine half to pieces to get the modules in. Once everything was back together I switched the power on, but of course all I got was the triple beep of the BIOS “Your memory is screwed” signal.

Some online research soon revealed that there is a subtle but crucial difference between the PC133r memory I had purchased and the PC133u memory that my machine was expecting; apparently the “r” denotes registered modules, meant for servers, while ordinary desktops like mine want the unbuffered “u” variety, and the two are not interchangeable. After a further bout of dismantling and rebuilding I had my old configuration working again.

I guess I’m only down a few pounds and a couple of hours, and I have learned something, albeit only some almost completely useless knowledge about the inner workings of a long-obsolete computer, which will probably crowd something more valuable out of my brain. The lesson I’m drawing from this is that change is to be feared, and I should just be happy with what I’ve got.

I Wanna Wanna get the New Model

I had never bought any Apple product before I got my iPhone last year; partly because I didn’t want to betray my open-source principles, but mainly because I was too cheap. I’ve been very impressed with the gadget, but I haven’t been completely converted to the cult of Jobs, and if and when someone comes out with a better smartphone I’ll probably switch, especially if it runs on Android or some other free OS.

I was intrigued rather than obsessed by the fuss surrounding the launch of the latest Apple toy, the already legendary iPad, but now that the details are public I have to confess that I am tempted to buy one, though there is absolutely no way that I could justify the expense, since I would only ever use it in the house, and I already have a desktop and four laptops of varying degrees of obsolescence lying around, as well as the iPhone.

What’s most attractive about the iPad is the promise of instant access to the web, allowing me to fill spare moments with casual browsing, much in the way that I use my iPhone now, but without the eye-straining tiny screen. I have a feeling that I will get one eventually, but I’ll leave it a while to see if any cheaper clones come out.

Or maybe I should just buy some more books, or start talking to people, or something crazy like that.

Here’s the horribly contrived (even by our standards) but musically impeccable video link.

Command-X, Command-C, Command-V

On a more positive note, Cut and Paste has come to the iPhone! So now I can post all those interesting links in my blog without having to resort to writing them down on a piece of paper! Maybe this new media world isn’t so bad after all.
