HEY! Where’s the New Technology?

Just today, Microsoft launched the successor to its ever-popular Xbox 360 console: the Xbox One. The announcement follows Sony's plans to launch the PlayStation 4 this holiday season, and Nintendo's release of its latest console, the Wii U, late last year. But the announcement comes as a disappointment to many, myself included. A passionate technophile and futurist, I am left asking: HEY! Where's the new technology? And that question doesn't stop with consoles; it extends to "new" technologies in general.

So, some background to put the arguments to come in perspective…

In 1965, Intel’s Gordon Moore hypothesized that the number of transistors and integrated circuits would double every two years. In laymen’s terms, every two years we will see computer chips double in power while decreasing in size and cost. This has held constant for the past 48 years, and has given us amazing technologies in the process. It’s awe-inspiring to think about how we went from the cell phones of Gordon Gekko to the tablet devices like our iPhones in under 20 years! This is all due to his theory, formally known as ‘Moore’s Law.’

BUT all good things must come to an end, and even Gordon Moore himself has admitted this "law" is not infinite. One need only consult the laws of thermodynamics to understand why. Ultimately, silicon, the basis for our microchips, will reach a point where transistors cannot get any smaller and still perform well without leaking heat. In other words, silicon chips will reach a point where they cannot handle the circuitry needed to carry out their processes. Experts contend this will happen by the 2020s. But futurists need not worry: there are alternatives to silicon in development, such as bio-integrated circuitry, nanowires, and other far-fetched but possibly practical ideas.
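
You can even sanity-check that "by the 2020s" claim yourself with some crude math. The numbers below are my own illustrative assumptions (a ~22 nm process today, features shrinking by a factor of √2 every two years so density doubles, and trouble starting near ~5 nm, where a feature is only a couple dozen silicon atoms wide), not hard industry figures:

```python
import math

# Back-of-envelope check on the "silicon runs out in the 2020s" claim.
# Assumptions (illustrative, not authoritative): ~22 nm process in 2013,
# feature size shrinking by sqrt(2) every two years (so transistor
# density doubles), and serious trouble starting near ~5 nm.
START_YEAR = 2013
START_FEATURE_NM = 22.0
LIMIT_NM = 5.0
SHRINK_PER_STEP = math.sqrt(2)  # shrink factor per two-year step
YEARS_PER_STEP = 2

steps = math.log(START_FEATURE_NM / LIMIT_NM, SHRINK_PER_STEP)
print(f"~{START_YEAR + steps * YEARS_PER_STEP:.0f}: features hit {LIMIT_NM} nm")
```

Under those assumptions the wall lands in the early 2020s, right where the experts put it.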

SO, why bring up this super-technical rule? Because along with Moore's Law comes the belief that we ought to be seeing crazy new gadgets…but in the past few years…we really haven't.

OK, yes, there's the 3D printer and AMAZINGLY cool things on the horizon, especially in medicine. BUT the average gadget consumer out there, the guy who wants to buy the next Xbox or the newest iPhone, is left completely underwhelmed.

The Xbox One is just that. It's really just technology for technology's sake. It's a console (box) with a whole bunch of stuff wrapped into one (ah-ha, I get the name now!). So I can browse the internet or watch movies, all by voice command. But just like Siri on the iPhone, these gimmicks wear thin quickly. And that's because true technophiles (which is most gamers, btw) don't want technology for technology's sake; they want something that works and that they will use often. I will probably reach for my controller, which I actually use to play games, far more often than for inaccurate voice commands. Even if I want to watch Netflix, I don't want to repeat "play Mad Men" 10x before the AI gets it right!

Often companies are so crazy about the next gadget or new toy that they miss the programming part entirely. OK, cool, so I have Siri on my iPhone, but she can't even tell my iTunes playlists apart, let alone help me find my way around Brooklyn. So we have all this new processing power, but we are advancing so fast that we haven't given programmers a proper chance to catch up. By the time a team researching natural language for artificial intelligence creates an algorithm for a particular device or platform, that platform is probably obsolete, and now they have to rework everything for another device. And that gets SUPER expensive! We haven't given them a financial incentive to invest the millions into code that works, because companies are releasing software and hardware just to fund THE NEXT THING. E.g., Vista was largely a way for Microsoft to test ideas for Windows 7 and fund that investment, all while never realizing the full potential of the devices it ran on.

Instead, we have often opted for empty hulls of late. We have these fast new devices, but do you really notice a difference between the iPhone 4S and the 5? The average person won't; in fact, it's probably only a couple of seconds' difference in speed. And 4G still isn't fully built out in many areas of the country.

The way to impress people again is not to just hand someone a NEW device. Consumers have become a lot more sophisticated. They crave more than just the NEXT thing; they want something innovative. The first-generation iPhone was amazing because, for the first time, we were using a friggin' computer entirely by touch…and it could also make phone calls. Technology began to consolidate our lives into one convenient device. This was mind-boggling and amazing. The same happened with the Xbox. Microsoft revolutionized the way gamers behaved: it created a social community over its LIVE service to take advantage of its new, improved hardware. Gamers could now connect via a sort of social network, play with one another, and share content.

Today, we don't get as much innovation along with our new devices. And when we do, the technology is often in its infancy (like voice recognition and AI) and loses its appeal quickly because it's BLOODY FRUSTRATING. So "where's the new technology?" is the question I am left asking. Why should I break my contract for an iPhone 5 (because it's thinner and bigger?!)? That seems like a stupid reason to spend $700 to me. Or why spend $300 on a new console when my old one does the exact SAME things (because Kinect is built in? – who cares, Kinect games SUCK, give me HALO!)? The 360 and PS3 are still VERY solid pieces of hardware. The concentration should be on software development! *hits head against wall*

Eventually, as Moore's Law slows down and we approach the need to move beyond silicon for our processing power (the Post-Silicon Era, I call it), we will see programmers catch up. If the expert consensus holds, as we reach the limits of the silicon era, existing hardware will finally be put to FULL use. I have four cores of processing power on my MacBook Pro, and I barely ever need to use even two of them. So I am optimistic that technology for technology's sake is coming to a close. We will start seeing AMAZING breakthroughs in programs that take full advantage of the processing power Moore's Law has already given us. And I for one look forward to it!
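
To make that four-cores-sitting-idle point concrete, here's a minimal sketch in Python of the difference between software that uses one core and software that uses all of them. The workload is a toy stand-in for real work, and the exact numbers are just an illustration:

```python
import math
import os
import time
from multiprocessing import Pool

def busy_work(n):
    """Toy CPU-bound task standing in for real work."""
    return sum(math.sqrt(i) for i in range(n))

if __name__ == "__main__":
    # One chunk of work per available core.
    jobs = [5_000_000] * (os.cpu_count() or 4)

    # Serial: one core does everything while the others sit idle.
    start = time.perf_counter()
    serial = [busy_work(n) for n in jobs]
    print(f"serial:   {time.perf_counter() - start:.2f}s")

    # Parallel: the same work spread across every available core.
    start = time.perf_counter()
    with Pool() as pool:
        parallel = pool.map(busy_work, jobs)
    print(f"parallel: {time.perf_counter() - start:.2f}s")
```

On a four-core machine the parallel version should finish in roughly a quarter of the serial time. That gap is exactly the kind of free performance most of today's software leaves on the table.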
