PC continues to decline as Intel makes job cuts

Intel is openly stating that they’re moving away from PCs and slashing jobs in their PC-related business.

Starting around 2005 I began to tell people “the PC as we know it is starting to disappear.” It was beginning to be visible around then, and although I said at the time that it was only the beginning, the result was inevitable. The move toward laptops, then handhelds, then other non-PC devices and cloud computing was already visible, and the change was obviously exponential. I suspect that by 2020 it will be obvious to everybody that PCs are going to be what they were back in the 1980s: a tool for specialists like scientists and engineers, and a tool and toy for hobbyists.

I myself don’t know what I’d do without a PC. I’ve been working with them since I was a kid. As a person with disabilities I always found them a tool of personal freedom and liberation. And I guess I’ll probably always have one, if I’m allowed to. It’s that last bit that worries me.

The move toward closed hardware and centralized cloud computing and storage means we are heading back toward centralized control of computing. And my biggest worry is the loss of individual and family freedom that comes with it.

Support independent journalism

The question people won’t ask: the public duty of publicly traded corporations

What are the ethics of an artificial entity–that is, a publicly-traded corporation? We can call the owner of a business a jerk, so why can’t we call an artificial thing like Twitter a jerk?

Publicly traded corporations are not owned by anybody. Or rather, they are owned by whoever owns stocks, and if it’s publicly traded, that means anyone can buy stock. Most such stocks are traded by computers, and most of the stocks are not held by individuals but instead are in retirement accounts and “diversified portfolios” where, typically, no one really knows all the stocks in the account except whoever runs the retirement account.

What this means is that publicly traded corporations like Twitter are effectively owned by no one. They exist solely as a legal fiction, but one that continues to do things that have real-world consequences for countless people.

In most countries, it is understood that such entities, effectively owned by no one and thus having no real accountability, are an artificial construct of the state that is there at the state’s convenience as well as the convenience of investors–but not necessarily at the convenience of customers.

Because of this last part–caring about customers only indirectly–it also used to be understood that corporate entities have a duty to behave in a way that is ethically acceptable to most people.

I will never understand those who retreat to “market” answers as if putting limits on corporate power is automatically Communism. Twitter at the moment is behaving very badly. It has banned Robert Stacy McCain, whose friends all call him Stacy. The hashtag #FreeStacy should arguably be trending at the moment, but Twitter has quashed it so people don’t see it. Twitter has allowed itself to be taken over by a well-known feminist ideologue whom Stacy criticized.

Notice how I keep saying “it” did this. Because Twitter is a corporate entity whose human parts are all replaceable.

Is it ever OK to start asking what moral and ethical behavior looks like in a business, especially a business that owns such a large share of the internet’s mindspace? Does it not have a duty to act in the public interest on things like political speech? If not a legally enforced duty, surely one we all recognize as an ethical one?

Twitter’s actions are unethical and deplorable. That’s what we should be saying.

In any case, #FreeStacy McCain!

60% of web traffic now on mobile devices

File this under “general musings.”

But: the majority of web traffic is now on smartphones, tablets, etc.

I remember how much grief I used to get for saying the PC was slowly disappearing. I started saying that about 10 years ago. Whenever I said this I was accused of saying the PC was going to “die.” No. What I said was that it would slowly disappear from most people’s lives because they wouldn’t need one, and it would eventually turn back into what it was originally: a tool and/or toy for engineers, scientists, hobbyists, gamers, and nerds. That process was already visible to me in the middle of the last decade, although, as with most things, it takes time; it’s no more an overnight phenomenon than any other major shift. It takes years, but the inexorable nature of exponential growth eventually becomes obvious to everyone.

I’d say we’re very nearly there. About the only thing the average person–not the engineer, the geek, the gamer, etc., the average person–needs a PC for is if they’re doing a lot of typing. How much of the population is that? A few million in the US I’d wager. Some accountants, some writers, lawyers perhaps, that sort of thing.

I’m going to enjoy the PC as a hobby again. I do wonder at times, though, how much effort will go into actually making it difficult to build your own computer. Linux and BSD will likely be the future there, I’d think. The turn of the next decade should be interesting to watch in that regard.

Yes, yes, I get it. You will probably always want a PC. So will I. I just honestly wonder whether corporate and government entities are going to start going out of their way to make that tough.

Technology & the end of capitalism as we know it?

This video by the always-fantastic CGP Grey is more far-reaching than most of his videos, and hits on several items we’ve discussed on Dean’s World in the past. For those of you still around (far fewer, I’d guess, than this site had at its peak), I would merely note that it touches on something I’ve said many times before: we do seem to be getting into territory where we no longer even understand the economy, let alone have a way to fix it:

I honestly have to wonder if the “socialist revolution” envisioned by Marx won’t become a reality–not because anything Marx said was right (in fact I’m inclined to think he was wrong about most things), but because we literally hit a place where there are almost no jobs left, even for bright, talented people. At that point we are faced with a choice: accept a situation where all available resources are owned by a tiny fraction of the populace, or start entering a Star Trek-y future in which resource scarcity no longer drives society, “competition” becomes an almost meaningless concept, and we have some form of socialism-by-default, because the only alternative is mass revolution and complete social collapse.

The horse analogy here is uncomfortably close to how I see the future–closer than a lot of my friends seem willing to even look:

What will we do when there really are hardly any jobs requiring humans for much of anything except to say “please give me stuff?”

I’ve thought for some time now that we should start thinking about these questions more. Alas, most people seem content to either watch TV, play video games, or engage in pointless partisan left/right squabbles instead.

I’ve spent much of the last 10 years being told I’m crazy when I mention these things, or accused of having an ideological agenda, which has been rather crazy-making, as I sit and watch these trends unfold and say, “No, I really don’t have an agenda. I really do think this is what’s going to happen, and I don’t know what to do about it, but we ought to be thinking about it.”

Nor do I have an answer now, except to say that I’m really pretty sure the human species is now facing questions much bigger than anything, and I do mean anything, it has ever faced before. One possibility is dystopia. Another is utopia. Likely it’ll be somewhere between those two, but I honestly don’t know what will happen, except that I think all the old economic models will pretty much collapse.

Atomic Computing

It has been possible for some years now to use atoms to represent 0s and 1s as part of a computing scheme. Getting people to believe this is actually possible and practical seems to be hard, although this new demonstration by IBM should make it clearer just how much potential there is at this level.

And if you’re sitting there thinking “yes, but is it practical?” contemplate that the first major commercially sold electronic computer used 5,200 vacuum tubes, had the equivalent of about 1K of memory, weighed 13 metric tons, consumed about 125 kilowatts of power, performed about 1,900 operations per second, and cost millions of dollars. What is just a demonstration today can become an everyday reality in the blink of an eye in historical terms.
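To get a feel for how fast that gap closes, here’s a rough back-of-the-envelope sketch in Python. The modern throughput figure and the two-year doubling period are my own loose assumptions, not measurements–the point is only to show how few doublings separate a room-sized curiosity from an everyday device:

```python
import math

# UNIVAC-era machine (circa 1951): roughly 1,900 operations per second.
old_ops_per_sec = 1_900
# Assumed rough throughput of a modern processor (loose assumption).
modern_ops_per_sec = 1e12

# How many doublings separate the two?
doublings = math.log2(modern_ops_per_sec / old_ops_per_sec)
# Assuming a Moore's-law-style doubling every ~2 years:
years = doublings * 2

print(f"{doublings:.0f} doublings, roughly {years:.0f} years")
```

On those assumptions it works out to about 29 doublings, or roughly six decades–which lines up pretty well with the actual history between the vacuum-tube era and today.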

The exponential growth curve of technology is not going to be hitting a plateau any time soon methinks.
