
Sixty years after transistors were invented and nearly five decades since they were first integrated into silicon chips, the tiny on-off switches dubbed the "nerve cells" of the information age are starting to show their age.
The devices - whose miniaturization over time set in motion the race for faster, smaller and cheaper electronics - have been shrunk so much that the day is approaching when it will be physically impossible to make them even tinier.
Once chip makers can't squeeze any more into the same-sized slice of silicon, the dramatic performance gains and cost reductions in computing over the years could suddenly slow. And the engine that's driven the digital revolution - and modern economy - could grind to a halt.
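
To put rough numbers on the scaling the article describes, here is a minimal sketch. It assumes the classic Moore's-law cadence of a doubling roughly every two years and the widely cited ~2,300-transistor count of the 1971 Intel 4004; neither figure comes from the article itself.

```python
# Rough illustration of the exponential scaling behind the "dramatic performance
# gains and cost reductions" the article mentions.
# ASSUMPTIONS (not from the article): a doubling roughly every two years, and the
# widely cited ~2,300-transistor count of the 1971 Intel 4004.

def projected_transistors(start_count: int, years: float, doubling_period: float = 2.0) -> float:
    """Project a transistor count forward under a fixed doubling period (in years)."""
    return start_count * 2 ** (years / doubling_period)

# 36 years at one doubling every ~2 years is 2**18, i.e. roughly a 260,000x increase:
print(f"{projected_transistors(2_300, 36):,.0f}")  # ~600 million -- about the order of
                                                   # magnitude of late-2000s desktop CPUs
```

The point of the sketch is only that a fixed doubling period compounds into factors of hundreds of thousands over a few decades, which is why a hard physical limit on shrinking transistors would be such a break from the historical trend.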

The Callisto Protocol director thinks the solution involves the right people, the right timing, and perhaps a little bit of AI
I don't agree with that. I WISH I could agree with that. But buying habits and customer opinions prove otherwise.
We've seen developers in the AAA space try new things and new ideas. More often than not, customers aren't willing to give them a chance, or not enough people buy into the project for it to grow.
Creativity works better in the indie space because the budgets, pressures, and expectations aren't the same.
It's a nice idea, and it worked during the PS2/PS3 era, when AAA games didn't cost hundreds of millions of dollars. Smaller budgets and shorter development times left room for more creativity and more risk; a game didn't need to sell 4 million+ copies to break even. Things are different now.
This is the guy who bragged about crunching his staff and having them work through the night. Crunch culture has cost the industry more talent and done more damage than any other factor. Screw him.
Lol at the engine metaphor. Overly dramatic journo.
And only one spec will be used to make all games equal.
Meh, they say this every year.
Although it's inevitable [who knows when that happens], it's more speculation as usual.
Furthermore, IBM has reached 2 terahertz in its labs and promises to reach that speed at room temperature (it has already done so at 300 GHz). There have also been breakthroughs in silicon processors that use photonic effects to enhance performance between cores. Silicon processors still have plenty of life left in them.
Besides, there's gonna be a quantum computing revolution, not to mention there are also possibilities with DNA, rendering the problem moot.
Edit:
It's too bleeding edge and I don't know much, but supposedly you can use DNA to process and store information (after all, DNA contains information), and it would supposedly be a lot faster than current processors. I don't know the details, but I bet there's more information on quantum computers than on DNA computers; my bet is that the latter won't even come to fruition, or who knows, perhaps a mix of both. I've even heard about "water computers", but I seriously don't know anything about that.