Author Topic: Vanishing point: the rise of the invisible computer (The Death of Moore's Law?)  (Read 3289 times)


Offline corbe

  • Hero Member
  • *****
  • Posts: 38,354

Vanishing point: the rise of the invisible computer

 For decades, computers have got smaller and more powerful, enabling huge scientific progress. But this can’t go on for ever. What happens when they stop shrinking?

by Tim Cross

Thursday 26 January 2017 00.59 EST


In 1971, Intel, then an obscure firm in what would only later come to be known as Silicon Valley, released a chip called the 4004. It was the world’s first commercially available microprocessor, which meant it sported all the electronic circuits necessary for advanced number-crunching in a single, tiny package. It was a marvel of its time, built from 2,300 tiny transistors, each around 10,000 nanometres (or billionths of a metre) across – about the size of a red blood cell. A transistor is an electronic switch that, by flipping between “on” and “off”, provides a physical representation of the 1s and 0s that are the fundamental particles of information.

In 2015 Intel, by then the world’s leading chipmaker, with revenues of more than $55bn that year, released its Skylake chips. The firm no longer publishes exact numbers, but the best guess is that they have about 1.5bn–2bn transistors apiece. Spaced 14 nanometres apart, each is so tiny as to be literally invisible, for they are more than an order of magnitude smaller than the wavelengths of light that humans use to see.

Everyone knows that modern computers are better than old ones. But it is hard to convey just how much better, for no other consumer technology has improved at anything approaching a similar pace. The standard analogy is with cars: if the car from 1971 had improved at the same rate as computer chips, then by 2015 new models would have had top speeds of about 420 million miles per hour. That is roughly two-thirds the speed of light, or fast enough to drive round the world in less than a fifth of a second. If that is still too slow, then before the end of 2017 models that can go twice as fast again will begin arriving in showrooms.
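The arithmetic behind the car analogy is easy to check. A quick Python sketch, assuming a 1971 top speed of roughly 100mph (that baseline is an assumption for illustration, not a figure from the article):

```python
# Back-of-the-envelope check of the car analogy.  The 1971 baseline top
# speed is an assumption for illustration, not a figure from the article.

YEARS = 2015 - 1971
DOUBLINGS = YEARS // 2                  # Moore's law: a doubling every two years -> 22
car_1971_mph = 100                      # assumed top speed of a 1971 car
car_2015_mph = car_1971_mph * 2 ** DOUBLINGS

SPEED_OF_LIGHT_MPH = 299_792_458 * 3600 / 1609.344   # ~670 million mph
EARTH_CIRCUMFERENCE_MILES = 24_901

print(f"{DOUBLINGS} doublings -> about {car_2015_mph / 1e6:.0f} million mph")
print(f"fraction of light speed: {car_2015_mph / SPEED_OF_LIGHT_MPH:.2f}")
print(f"once around the world: {EARTH_CIRCUMFERENCE_MILES / car_2015_mph * 3600:.2f} seconds")
```

With those assumptions the numbers come out at about 419 million mph, roughly 0.63 of light speed and about a fifth of a second for a lap of the planet, in line with the figures in the paragraph above.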

This blistering progress is a consequence of an observation first made in 1965 by one of Intel’s founders, Gordon Moore. Moore noted that the number of components that could be crammed onto an integrated circuit was doubling every year. Later amended to every two years, “Moore’s law” has become a self-fulfilling prophecy that sets the pace for the entire computing industry. Each year, firms such as Intel and the Taiwan Semiconductor Manufacturing Company spend billions of dollars figuring out how to keep shrinking the components that go into computer chips. Along the way, Moore’s law has helped to build a world in which chips are built in to everything from kettles to cars (which can, increasingly, drive themselves), where millions of people relax in virtual worlds, financial markets are played by algorithms and pundits worry that artificial intelligence will soon take all the jobs.

But it is also a force that is nearly spent. Shrinking a chip’s components gets harder each time you do it, and with modern transistors having features measured in mere dozens of atoms, engineers are simply running out of room. There were roughly 22 ticks of Moore’s law between the launch of the 4004 in 1971 and mid-2016. For the law to hold until 2050 there will have to be 17 more, in which case engineers would have to figure out how to build computers from components smaller than an atom of hydrogen, the smallest element there is. That, as far as anyone knows, is impossible.
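A rough check of that claim, assuming each tick doubles transistor density and therefore shrinks linear feature sizes by a factor of √2 (the 14nm starting point is approximate):

```python
# Rough check of the sub-atomic claim.  Assumes each Moore's-law tick doubles
# transistor density, so linear feature sizes shrink by a factor of sqrt(2);
# 14nm is roughly where leading-edge chips were in 2016.

FEATURE_2016_NM = 14
TICKS_TO_2050 = 17                    # 17 more doublings, as in the article
HYDROGEN_DIAMETER_NM = 0.1            # about one angstrom

feature_2050_nm = FEATURE_2016_NM / (2 ** 0.5) ** TICKS_TO_2050
print(f"implied 2050 feature size:   {feature_2050_nm:.3f} nm")   # ~0.039 nm
print(f"diameter of a hydrogen atom: {HYDROGEN_DIAMETER_NM:.3f} nm")
```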

Yet business will kill Moore’s law before physics does, for the benefits of shrinking transistors are not what they used to be. Moore’s law was given teeth by a related phenomenon called “Dennard scaling” (named for Robert Dennard, an IBM engineer who first formalised the idea in 1974), which states that shrinking a chip’s components makes that chip faster, less power-hungry and cheaper to produce. Chips with smaller components, in other words, are better chips, which is why the computing industry has been able to persuade consumers to shell out for the latest models every few years. But the old magic is fading.
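The textbook form of Dennard scaling can be summarised in a few lines. This sketch shows the ideal relationship only, not any particular chip; the article's point is precisely that this ideal no longer holds:

```python
# Classical Dennard scaling in one function: shrink every linear dimension by
# a factor k and, ideally, voltage and capacitance shrink with it while clock
# frequency rises, so power density stays flat.  Textbook ideal only.

def dennard_scaling(k: float) -> dict:
    """Relative change in key quantities when dimensions shrink by 1/k."""
    voltage = 1 / k
    capacitance = 1 / k
    frequency = k
    power_per_transistor = capacitance * voltage ** 2 * frequency   # P ~ C*V^2*f = 1/k^2
    transistors_per_area = k ** 2
    return {
        "transistors per area": transistors_per_area,
        "clock frequency":      frequency,
        "power per transistor": power_per_transistor,
        "power density":        transistors_per_area * power_per_transistor,  # stays ~1
    }

# One Moore's-law tick: dimensions shrink by roughly sqrt(2).
for quantity, factor in dennard_scaling(k=2 ** 0.5).items():
    print(f"{quantity:22s} x{factor:.2f}")
```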

Shrinking chips no longer makes them faster or more efficient in the way that it used to. At the same time, the rising cost of the ultra-sophisticated equipment needed to make the chips is eroding the financial gains. Moore’s second law, more light-hearted than his first, states that the cost of a “foundry”, as such factories are called, doubles every four years. A modern one leaves little change from $10bn. Even for Intel, that is a lot of money.

The result is a consensus among Silicon Valley’s experts that Moore’s law is near its end. “From an economic standpoint, Moore’s law is dead,” says Linley Gwennap, who runs a Silicon Valley analysis firm. Dario Gil, IBM’s head of research and development, is similarly frank: “I would say categorically that the future of computing cannot just be Moore’s law any more.” Bob Colwell, a former chip designer at Intel, thinks the industry may be able to get down to chips whose components are just five nanometres apart by the early 2020s – “but you’ll struggle to persuade me that they’ll get much further than that”.

One of the most powerful technological forces of the past 50 years, in other words, will soon have run its course. The assumption that computers will carry on getting better and cheaper at breakneck speed is baked into people’s ideas about the future. It underlies many technological forecasts, from self-driving cars to better artificial intelligence and ever more compelling consumer gadgetry. There are other ways of making computers better besides shrinking their components. The end of Moore’s law does not mean that the computer revolution will stall. But it does mean that the coming decades will look very different from the preceding ones, for none of the alternatives is as reliable, or as repeatable, as the great shrinkage of the past half-century.

Moore’s law has made computers smaller, transforming them from room-filling behemoths to svelte, pocket-filling slabs. It has also made them more frugal: a smartphone that packs more computing power than was available to entire nations in 1971 can last a day or more on a single battery charge. But its most famous effect has been to make computers faster. By 2050, when Moore’s law will be ancient history, engineers will have to make use of a string of other tricks if they are to keep computers getting faster.

There are some easy wins. One is better programming. The breakneck pace of Moore’s law has in the past left software firms with little time to streamline their products. The fact that their customers would be buying faster machines every few years weakened the incentive even further: the easiest way to speed up sluggish code might simply be to wait a year or two for hardware to catch up. As Moore’s law winds down, the famously short product cycles of the computing industry may start to lengthen, giving programmers more time to polish their work.

Another is to design chips that trade general mathematical prowess for more specialised hardware. Modern chips are starting to feature specialised circuits designed to speed up common tasks, such as decompressing a film, performing the complex calculations required for encryption or drawing the complicated 3D graphics used in video games. As computers spread into all sorts of other products, such specialised silicon will be very useful. Self-driving cars, for instance, will increasingly make use of machine vision, in which computers learn to interpret images from the real world, classifying objects and extracting information, which is a computationally demanding task. Specialised circuitry will provide a significant boost.
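The trade-off is easier to feel in software than to see in silicon. A rough analogy, assuming numpy is available; the library simply stands in for a dedicated circuit, and a plain Python loop stands in for general-purpose hardware:

```python
# The same job done two ways: a general-purpose, one-element-at-a-time loop,
# and a routine specialised for exactly this task.  numpy stands in for
# dedicated silicon purely as an analogy.

import time
import numpy as np

n = 1_000_000
a = np.random.rand(n)
b = np.random.rand(n)

start = time.perf_counter()
total = 0.0
for x, y in zip(a, b):          # general-purpose: flexible, but slow
    total += x * y
loop_time = time.perf_counter() - start

start = time.perf_counter()
total_fast = np.dot(a, b)       # "specialised": built to do this one thing quickly
dot_time = time.perf_counter() - start

print(f"general loop:        {loop_time:.3f} s")
print(f"specialised routine: {dot_time:.4f} s")
```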


<..snip..>

https://www.theguardian.com/technology/2017/jan/26/vanishing-point-rise-invisible-computer

No government in the 12,000 years of modern mankind history has led its people into anything but the history books with a simple lesson, don't let this happen to you.

Offline corbe

  • Hero Member
  • *****
  • Posts: 38,354
   Long but interesting read that I understood, proving my (computer illiterate) ol lady right again, I am a GEEK!
No government in the 12,000 years of modern mankind history has led its people into anything but the history books with a simple lesson, don't let this happen to you.

Online Weird Tolkienish Figure

  • Technical
  • *****
  • Posts: 18,172
Fascinating. Not only is Moore's law slowing down but productivity growth has been stalling for quite some time.


Like the article says, I have felt for quite some time that coding has lagged hardware.

Online roamer_1

  • Hero Member
  • *****
  • Posts: 43,782
There are some easy wins. One is better programming. The breakneck pace of Moore’s law has in the past left software firms with little time to streamline their products.

A pipe dream if there ever was one. Programming has done nothing but bloat since its inception. Imagine how fast these new machines would fly if the software they ran was still written in ASM...

But programming is designed more for convenience and sloth... Thus it gets easier, but more bloated, with every revision.

Offline montanajoe

  • Hero Member
  • *****
  • Posts: 2,324
Most people, myself included, are more like Jed Clampett than Captain Kirk when they sit down to their shiny new computer. For quite a few years now, no matter how much faster the machine is supposed to be, I don't perceive any real difference.

I think the real difference is in applications and the supercomputers used by government, business and others to keep track of everyone's every move :shrug:

Oceander

  • Guest
A pipe dream if there ever was one. Programming has done nothing but bloat since its inception. Imagine how fast these new machines would fly if the software they ran was still written in ASM...

But programming is designed more for convenience and sloth... Thus it gets easier, but more bloated, with every revision.

True, but that's in large part because they've been able to get away with being sloppy thanks to Moore's Law. Once that peters out, competitive pressure will require that the bloat be cut.

Online Weird Tolkienish Figure

  • Technical
  • *****
  • Posts: 18,172
A pipe dream if there ever was one. Programming has done nothing but bloat since its inception. Imagine how fast these new machines would fly if the software they ran was still written in ASM...

But programming is designed more for convenience and sloth... Thus it gets easier, but more bloated, with every revision.


Writing it in ASM wouldn't necessarily make it faster. It depends on the skill of the programmer and the algorithms written.

Online roamer_1

  • Hero Member
  • *****
  • Posts: 43,782
True, but that's in large part because they've been able to get away with being sloppy thanks to Moore's Law. Once that peters out, competitive pressure will require that the bloat be cut.

I don't think software has the ability to return to that mentality. Finding elegance in code is extraordinary these days. The bottom line has been production for so long, and chunked-and-formed McOOP plug-and-play modules so pervasive, that discipline has fled. When is the last time you found a programmer who worried over the size of his finished exe, or the weight of it in memory? You're lucky if the code is even //commented.

Online roamer_1

  • Hero Member
  • *****
  • Posts: 43,782

Writing it in ASM wouldn't necessarily make it faster. It depends on the skill of the programmer and the algorithms written.

Being ABLE to program in ASM denotes a level of skill that is nearly absent today.

I'm sorry. I am probably overly sensitive to it right now. I am a Delphi programmer, recently consulting for an IT dept on repairing some particularly glitchy code... They don't have anyone on staff who is native Pascal/Delphi anymore, and while I am no genius at it, the guys who wrote most of it were pigs. It's probably got my dander up.

Offline Restored

  • TBR Advisory Committee
  • ***
  • Posts: 3,659
Quote
For quite a few years now, no matter how much faster the machine is supposed to be, I don't perceive any real difference.

Try playing games like Call of Duty and Battlefield. You will notice the difference as the games get more complex.
Countdown to Resignation

Offline Restored

  • TBR Advisory Committee
  • ***
  • Posts: 3,659
Being ABLE to program in ASM denotes a level of skill that is nearly absent today.

I'm sorry. I am probably overly sensitive to it right now. I am a Delphi programmer, recently consulting for an IT dept on repairing some particularly glitchy code... They don't have anyone on staff who is native Pascal/Delphi anymore, and while I am no genius at it, the guys who wrote most of it were pigs. It's probably got my dander up.

ASM is no longer necessary. There is no point writing code that no one can read. At the speed of computers today, you could write it in COBOL and it would be fine. Any kid from India could read the code and make changes. We run a Linux mainframe and speed is determined by how you write the SQL. The program itself is wickedly fast.
Countdown to Resignation

Offline r9etb

  • Hero Member
  • *****
  • Posts: 3,467
  • Gender: Male
For decades, computers have got smaller and more powerful, enabling huge scientific progress. But this can’t go on for ever. What happens when they stop shrinking?

I remember reading an article with a thesis almost identical to this one.... back in 1979, or so.

Of course, Moore's Law isn't really a "law" at all -- it was just a smart guy's observation about circuit density, but people tend to treat it as if circuit density is the only possible approach.

Online roamer_1

  • Hero Member
  • *****
  • Posts: 43,782
Most people, myself included, are more like Jed Clampett than Captain Kirk when they sit down to their shiny new computer. For quite a few years now, no matter how much faster the machine is supposed to be, I don't perceive any real difference.

Largely a matter of bloat in software following the ability to get away with it... The bigger the hardware got, the bigger the software got. It isn't a direct correlation, but close enough to be generally true. Put Office 2k on a new machine, and it will fly, by comparison... And without any real differences in features.

I found my old customized installer for DOS 7.10/Win98SE earlier this winter, and just for giggles spun it up in one of my test benches. Granted, there's a lot it couldn't talk to (no drivers available), so a lot of the machine's capabilities weren't accessible... but man, did that old girl sail. The splash-screen went by so fast you could hardly even see it.

Online roamer_1

  • Hero Member
  • *****
  • Posts: 43,782
ASM is no longer necessary. There is no point writing code that no one can read.

That's not my point.  Back in the day, spaghetti code was frowned upon, and convention demanded tight, even sparse code. A real wizard could write amazing power into positively tiny code. That discipline is what I am saying is lost.

Offline Suppressed

  • Hero Member
  • *****
  • Posts: 12,921
  • Gender: Male
    • Avatar
That's not my point.  Back in the day, spaghetti code was frowned upon, and convention demanded tight, even sparse code. A real wizard could write amazing power into positively tiny code. That discipline is what I am saying is lost.

Exactly right. It's disgusting how much "waste" there is now. You also won't find a Mel these days...

http://www.catb.org/jargon/html/story-of-mel.html

The Story of Mel

This was posted to Usenet by its author, Ed Nather (<nather@astro.as.utexas.edu>), on May 21, 1983.


A recent article devoted to the macho side of programming
made the bald and unvarnished statement:

    Real Programmers write in FORTRAN.

Maybe they do now,
in this decadent era of
Lite beer, hand calculators, and “user-friendly” software
but back in the Good Old Days,
when the term “software” sounded funny
and Real Computers were made out of drums and vacuum tubes,
Real Programmers wrote in machine code.
Not FORTRAN.  Not RATFOR.  Not, even, assembly language.
Machine Code.
Raw, unadorned, inscrutable hexadecimal numbers.
Directly.

Lest a whole new generation of programmers
grow up in ignorance of this glorious past,
I feel duty-bound to describe,
as best I can through the generation gap,
how a Real Programmer wrote code.
I'll call him Mel,
because that was his name.

I first met Mel when I went to work for Royal McBee Computer Corp.,
a now-defunct subsidiary of the typewriter company.
The firm manufactured the LGP-30,
a small, cheap (by the standards of the day)
drum-memory computer,
and had just started to manufacture
the RPC-4000, a much-improved,
bigger, better, faster — drum-memory computer.
Cores cost too much,
and weren't here to stay, anyway.
(That's why you haven't heard of the company,
or the computer.)

I had been hired to write a FORTRAN compiler
for this new marvel and Mel was my guide to its wonders.
Mel didn't approve of compilers.

“If a program can't rewrite its own code”,
he asked, “what good is it?”

Mel had written,
in hexadecimal,
the most popular computer program the company owned.
It ran on the LGP-30
and played blackjack with potential customers
at computer shows.
Its effect was always dramatic.
The LGP-30 booth was packed at every show,
and the IBM salesmen stood around
talking to each other.
Whether or not this actually sold computers
was a question we never discussed.

Mel's job was to re-write
the blackjack program for the RPC-4000.
(Port?  What does that mean?)
The new computer had a one-plus-one
addressing scheme,
in which each machine instruction,
in addition to the operation code
and the address of the needed operand,
had a second address that indicated where, on the revolving drum,
the next instruction was located.

In modern parlance,
every single instruction was followed by a GO TO!
Put that in Pascal's pipe and smoke it.
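For anyone trying to picture the one-plus-one scheme, here is a toy model in Python; the drum addresses, opcode values and field names are invented for illustration and are not the real RPC-4000 instruction format:

```python
# Toy model of one-plus-one addressing: every instruction carries the drum
# address of the next instruction, so execution hops around the drum rather
# than falling through to "the next line".

from dataclasses import dataclass

@dataclass
class Instruction:
    opcode: int          # what to do
    operand_addr: int    # where the data sits on the drum
    next_addr: int       # the "plus one": where the next instruction sits

def run(drum, start, max_steps=10):
    """Follow the next-address chain, as the revolving drum's read head would."""
    addr = start
    for _ in range(max_steps):
        inst = drum.get(addr)
        if inst is None:
            break
        print(f"@{addr:04d}: op={inst.opcode}, operand @{inst.operand_addr}")
        addr = inst.next_addr          # every instruction ends in a GO TO

# Instructions scattered around the drum the way Mel placed them, so each one
# arrives under the read head just as the previous one finishes.
program = {
    100: Instruction(opcode=1, operand_addr=2000, next_addr=137),
    137: Instruction(opcode=2, operand_addr=2001, next_addr=174),
    174: Instruction(opcode=3, operand_addr=2002, next_addr=100),
}
run(program, start=100, max_steps=6)
```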

Mel loved the RPC-4000
because he could optimize his code:
that is, locate instructions on the drum
so that just as one finished its job,
the next would be just arriving at the “read head”
and available for immediate execution.
There was a program to do that job,
an “optimizing assembler”,
but Mel refused to use it.

“You never know where it's going to put things”,
he explained, “so you'd have to use separate constants”.

It was a long time before I understood that remark.
Since Mel knew the numerical value
of every operation code,
and assigned his own drum addresses,
every instruction he wrote could also be considered
a numerical constant.
He could pick up an earlier “add” instruction, say,
and multiply by it,
if it had the right numeric value.
His code was not easy for someone else to modify.

I compared Mel's hand-optimized programs
with the same code massaged by the optimizing assembler program,
and Mel's always ran faster.
That was because the “top-down” method of program design
hadn't been invented yet,
and Mel wouldn't have used it anyway.
He wrote the innermost parts of his program loops first,
so they would get first choice
of the optimum address locations on the drum.
The optimizing assembler wasn't smart enough to do it that way.

Mel never wrote time-delay loops, either,
even when the balky Flexowriter
required a delay between output characters to work right.
He just located instructions on the drum
so each successive one was just past the read head
when it was needed;
the drum had to execute another complete revolution
to find the next instruction.
He coined an unforgettable term for this procedure.
Although “optimum” is an absolute term,
like “unique”, it became common verbal practice
to make it relative:
“not quite optimum” or “less optimum”
or “not very optimum”.
Mel called the maximum time-delay locations
the “most pessimum”.

After he finished the blackjack program
and got it to run
(“Even the initializer is optimized”,
he said proudly),
he got a Change Request from the sales department.
The program used an elegant (optimized)
random number generator
to shuffle the “cards” and deal from the “deck”,
and some of the salesmen felt it was too fair,
since sometimes the customers lost.
They wanted Mel to modify the program
so, at the setting of a sense switch on the console,
they could change the odds and let the customer win.

Mel balked.
He felt this was patently dishonest,
which it was,
and that it impinged on his personal integrity as a programmer,
which it did,
so he refused to do it.
The Head Salesman talked to Mel,
as did the Big Boss and, at the boss's urging,
a few Fellow Programmers.
Mel finally gave in and wrote the code,
but he got the test backwards,
and, when the sense switch was turned on,
the program would cheat, winning every time.
Mel was delighted with this,
claiming his subconscious was uncontrollably ethical,
and adamantly refused to fix it.

After Mel had left the company for greener pa$ture$,
the Big Boss asked me to look at the code
and see if I could find the test and reverse it.
Somewhat reluctantly, I agreed to look.
Tracking Mel's code was a real adventure.

I have often felt that programming is an art form,
whose real value can only be appreciated
by another versed in the same arcane art;
there are lovely gems and brilliant coups
hidden from human view and admiration, sometimes forever,
by the very nature of the process.
You can learn a lot about an individual
just by reading through his code,
even in hexadecimal.
Mel was, I think, an unsung genius.

Perhaps my greatest shock came
when I found an innocent loop that had no test in it.
No test.  None.
Common sense said it had to be a closed loop,
where the program would circle, forever, endlessly.
Program control passed right through it, however,
and safely out the other side.
It took me two weeks to figure it out.

The RPC-4000 computer had a really modern facility
called an index register.
It allowed the programmer to write a program loop
that used an indexed instruction inside;
each time through,
the number in the index register
was added to the address of that instruction,
so it would refer
to the next datum in a series.
He had only to increment the index register
each time through.
Mel never used it.

Instead, he would pull the instruction into a machine register,
add one to its address,
and store it back.
He would then execute the modified instruction
right from the register.
The loop was written so this additional execution time
was taken into account —
just as this instruction finished,
the next one was right under the drum's read head,
ready to go.
But the loop had no test in it.

The vital clue came when I noticed
the index register bit,
the bit that lay between the address
and the operation code in the instruction word,
was turned on —
yet Mel never used the index register,
leaving it zero all the time.
When the light went on it nearly blinded me.

He had located the data he was working on
near the top of memory —
the largest locations the instructions could address —
so, after the last datum was handled,
incrementing the instruction address
would make it overflow.
The carry would add one to the
operation code, changing it to the next one in the instruction set:
a jump instruction.
Sure enough, the next program instruction was
in address location zero,
and the program went happily on its way.
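The trick is easier to see as a toy reconstruction. In the sketch below the word layout, opcode numbering and memory size are all invented; only the mechanism, a carry out of the address field bumping the opcode into a jump, matches the story:

```python
# Toy reconstruction of the overflow trick.  Pretend an instruction word has
# a 4-bit opcode in the high bits and a 12-bit operand address in the low
# bits.  These widths and opcode values are invented for illustration.

ADDR_BITS = 12
ADDR_MASK = (1 << ADDR_BITS) - 1      # 0x0FFF: the top of addressable memory
OP_LOAD, OP_JUMP = 5, 6               # pretend "jump" is the opcode after "load"

def word(opcode, addr):
    return (opcode << ADDR_BITS) | addr

def decode(w):
    return w >> ADDR_BITS, w & ADDR_MASK

# Start with a "load" pointing a few words below the top of memory, where
# the data was parked.
inst = word(OP_LOAD, ADDR_MASK - 2)

while True:
    op, addr = decode(inst)
    if op == OP_JUMP:
        print(f"address field overflowed; the carry made it a JUMP to {addr}")
        break
    print(f"load the datum at {addr}")
    inst += 1                          # self-modify: add one to the whole word
```

Incrementing the word walks the load up through the last few data locations; the increment past the top of the address field carries into the opcode, and the "load" becomes a "jump" to location zero, exactly the behaviour described above.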

I haven't kept in touch with Mel,
so I don't know if he ever gave in to the flood of
change that has washed over programming techniques
since those long-gone days.
I like to think he didn't.
In any event,
I was impressed enough that I quit looking for the
offending test,
telling the Big Boss I couldn't find it.
He didn't seem surprised.

When I left the company,
the blackjack program would still cheat
if you turned on the right sense switch,
and I think that's how it should be.
I didn't feel comfortable
hacking up the code of a Real Programmer.

This is one of hackerdom's great heroic epics, free verse or no. In a few spare images it captures more about the esthetics and psychology of hacking than all the scholarly volumes on the subject put together. (But for an opposing point of view, see the entry for Real Programmer.)

[1992 postscript — the author writes: “The original submission to the net was not in free verse, nor any approximation to it — it was straight prose style, in non-justified paragraphs. In bouncing around the net it apparently got modified into the ‘free verse’ form now popular. In other words, it got hacked on the net. That seems appropriate, somehow.” The author adds that he likes the ‘free-verse’ version better than his prose original...]

[1999 update: Mel's last name is now known. The manual for the LGP-30 refers to “Mel Kaye of Royal McBee who did the bulk of the programming [...] of the ACT 1 system”.]

[2001: The Royal McBee LGP-30 turns out to have one other claim to fame. It turns out that meteorologist Edward Lorenz was doing weather simulations on an LGP-30 when, in 1961, he discovered the “Butterfly Effect” and computational chaos. This seems, somehow, appropriate.]

[2002: A copy of the programming manual for the LGP-30 lives at http://ed-thelen.org/comp-hist/lgp-30-man.html]

+++++++++
“In the outside world, I'm a simple geologist. But in here .... I am Falcor, Defender of the Alliance” --Randy Marsh

“The most effectual means of being secure against pain is to retire within ourselves, and to suffice for our own happiness.” -- Thomas Jefferson

“He's so dumb he thinks a Mexican border pays rent.” --Foghorn Leghorn

Offline r9etb

  • Hero Member
  • *****
  • Posts: 3,467
  • Gender: Male
That's not my point.  Back in the day, spaghetti code was frowned upon, and convention demanded tight, even sparse code. A real wizard could write amazing power into positively tiny code. That discipline is what I am saying is lost.

And back in the day, real scholars only wrote in Latin.

This sort of "hey, you kids get off my lawn"-ism is just silly.  Sure, a lot of people write bad code these days.  Just like a lot of people wrote bad code back in the "good old days."  I look at some of the stuff I wrote 20 years ago and wonder what in the hell I was thinking.

Good programmers have discipline -- now, as then.


Online Weird Tolkienish Figure

  • Technical
  • *****
  • Posts: 18,172
Old fartism = compensating for the fact that you haven't had an erection in decades

Oceander

  • Guest
ASM is no longer necessary. There is no point writing code that no one can read. At the speed of computers today, you could write it in COBOL and it would be fine. Any kid from India could read the code and make changes. We run a Linux mainframe and speed is determined by how you write the SQL. The program itself is wickedly fast.

Depends on what you're writing.  If you need absolute efficiency in a particular routine, it may be written in ASM even if the overall program is written in C.

And the larger point is not that things should all be written in ASM so much as that, with Moore's Law hitting the wall, the emphasis will shift back to writing very clean code because ever-cheaper processing power won't be there to overcome poorly written code.

Offline Smokin Joe

  • Hero Member
  • *****
  • Posts: 56,712
  • I was a "conspiracy theorist". Now I'm just right.
A pipe dream if there ever was one. Programming has done nothing but bloat since its inception. Imagine how fast these new machines would fly if the software they ran was still written in ASM...

But programming is designed more for convenience and sloth... Thus it gets easier, but more bloated, with every revision.
The day will come when the self-driving car tells the robocop that it "just didn't see" the guy on the motorcycle...
How God must weep at humans' folly! Stand fast! God knows what he is doing!
Seventeen Techniques for Truth Suppression

Of all tyrannies, a tyranny sincerely exercised for the good of its victims may be the most oppressive. It would be better to live under robber barons than under omnipotent moral busybodies. The robber baron's cruelty may sometimes sleep, his cupidity may at some point be satiated; but those who torment us for our own good will torment us without end for they do so with the approval of their own conscience.

C S Lewis

Offline Smokin Joe

  • Hero Member
  • *****
  • Posts: 56,712
  • I was a "conspiracy theorist". Now I'm just right.
Old fartism = compensating for the fact that you haven't had an erection in decades
Well, the article is about Moore shrinkage... :nometalk:

Of chip sizes....
How God must weep at humans' folly! Stand fast! God knows what he is doing!
Seventeen Techniques for Truth Suppression

Of all tyrannies, a tyranny sincerely exercised for the good of its victims may be the most oppressive. It would be better to live under robber barons than under omnipotent moral busybodies. The robber baron's cruelty may sometimes sleep, his cupidity may at some point be satiated; but those who torment us for our own good will torment us without end for they do so with the approval of their own conscience.

C S Lewis

Offline Taxcontrol

  • Hero Member
  • *****
  • Posts: 651
  • Gender: Male
  • "Stupid should hurt" - Dad's wisdom
I am of the opinion that the next horizon for computation will likely involve light, or at least photons.  Advantages appear to be the ability to have thousands of parallel processors, lower power consumption and terahertz speeds.  The downside is size: compared to electrons, light waves are HUGE.

Oceander

  • Guest
I am of the opinion that the next horizon for computation will likely involve light, or at least photons.  Advantages appear to be the ability to have thousands of parallel processors, lower power consumption and terahertz speeds.  The downside is size: compared to electrons, light waves are HUGE.

Most likely.  You're almost certainly correct on the larger point that some new, unforeseen innovation in hardware will be required.