As a young man I went through a period of OCD fascination with all things American Civil War ("ACW"). It was the mid-1980s and Gore Vidal's "Lincoln" found its way into my hands (as a teenager I had raided my mother's "library" and come away with Vidal's "Burr", which set me off as a "Revolutionary Generation" groupie).
I won't bore you with my sense of the ACW here. Suffice it to say that it conforms with about 80% of the accepted narrative. I would only remind you that history is written by the victors with little consideration paid to the "victims" - the individuals used (killed, wounded, sickened) to further the political interests of the powerful. The term "survivor bias" is usually used in business, particularly for mutual funds, but I think the term's best use would be in the description of war.
Using the popular narrative that the ACW was fought to end slavery and to further freedom, that freedom did not last long on American soil. The decades following the ACW were marked by years of "economic recession" and stagnation. This malaise was fertile ground for the "revolution" of 1913 - the enactment of the 16th Amendment to the U.S. Constitution, allowing Congress to levy a tax on income. That same year, 1913, saw the U.S. enact the Federal Reserve Act ("FRA"). This was not a coincidence.
So let me cut to the chase: The U.S. fought the ACW (per the accepted narrative) to further the interests of Liberty, ratified the 13th Amendment abolishing slavery in 1865, found the following period's economic environment unsatisfactory, and re-established slavery with the 16th Amendment and the Federal Reserve Act.
Yea, I know... this is sort of a simple-minded assertion - until you think about it. WWI and WWII (really one war with a two-decade armistice - and what was Germany's, really all of Socialism's, motivation? To end the debt system) were fought for a great many reasons, or so our Western Intellectuals tell us (Chomsky often derides "intellectuals" as bright folks who need to make things complicated so they can then sound smart by simplifying them... seems to me he has a point), but to my mind these Wars were fought over bond (debt) payments - and all of the bleeding was done by folks who owned no bonds.
But back to the re-establishment of slavery in the U.S. by expanding the debt system, by enacting the income tax, and by establishing "Social Security". The U.S. KILLED well over 1 million people (those who died during the ACW, those who died of their wounds in the years following it, and the dependent women and children who died from the loss of a provider killed in the ACW) for the purpose of ending slavery - yet how else can the life of the modern "Debt Slave" ("DS") be described (this is not to make light of the living conditions of Slaves in the pre-ACW South)? True, today's DS is not subject to beatings, rape, and violence - unless he does not pay his taxes (and winds up in prison), does not pay his mortgage (and winds up on the street), does not make his car payment (and winds up a pedestrian in a world constructed around the automobile), or does not pay his student loans (and as a result lives a penniless existence).
OK. For those that reject that line of reasoning regarding the use of "Slave" in "Debt Slave" please feel free to use "Indentured Servant" ("IS").
And here comes the Technological Singularity ("TS"). Or maybe just TS Light. Stuart Staniford has this excellent essay on TS posted at his blog.
I have nothing to add to the discussion on what TS or TS Light means, but the exponential increase in computer processing power is going to accelerate the automation and robotization of the labor market. Yesteryear saw ATMs put bank tellers out to pasture, along with gas pump operators, telephone receptionists, and checkout clerks. Today, lawyers in the paper-pushing business have gotten bushwhacked. Liberal Arts Colleges and College professors are freaking doomed (who needs to be physically present to listen to a person repeat a lecture over and over again? Can't we record the lecture once and watch it forever on youtube? For free? This was always such horse sh#!. The history of the "lecture" in education has at its base the fact that in previous times most people were illiterate. The only way they could learn was to be lectured. Literate people can read the material of the lecture (you know, books? Essays? Dare I say "blogs"?) and take it where they wish at the pace that suits them, the customer/student. Technical educations will still need those pesky labs... but really, how much of your undergrad time was spent doing labs? A couple hundred hours total? WTF do we need to sit around for 4 years for? Because these people are in business, and their business needs you to pay for "The History of Rock and Roll", "Gender Discrimination in America", and "Practical Uses for Yoga" credits in order to "graduate". And why is it all or nothing? Isn't education incremental? Wall Street got rid of fractions in the 1990s. Universities can get rid of "degrees". A numerical representation of academic accomplishment would be ever so much more practical as well as accurate),
as are truck drivers, taxi drivers, pilots, ship captains, specialty surgeons, infantry soldiers (drones), police officers (cameras)... there is no end that I can see. What robots did to manufacturing they are going to do to every other sector.
The implications are freaking staggering. Consumer demand would plummet, to my mind, with the slack to be made up by government demand. Run your mind around that for a moment.
And share with me what you see.
To be continued....
Sunday, December 30, 2012
Enslavement, Freedom, Re-Enslavement, Singularity/Freedom?
6 comments:
I will agree that the rise of the machines has been impressive. However...
We've had a good long run with Moore's Law. Doubling computer power roughly every two years for over thirty years is impressive. I would disagree with Mr. Staniford's sources' claims that this can continue much longer, though. We're starting to get close to physical constraints now.
CPU power was once measured in megahertz and by whether you had a math co-processor. The faster the chip ran, the faster it computed, and the math co-processor did the major number-crunching to speed up the result. Since the days of the first Pentium, the co-processor has been included and active (the 486 had the co-processor included, but the SX variant disabled it for marketing reasons). We've hit a barrier in that no one has managed to get a chip running much faster than 4GHz; that wall was reached a few years back. So Intel and the others started putting more than one CPU core on each chip, with logic to spread the work evenly across them and put the results back together. The latest PC CPU has 6 cores and runs at 4GHz.
Physical size (distance) is becoming a problem now. The thinner the traces in the CPU, the closer together we can put them, but then we must decrease the voltage to keep electrons from hopping from one track to another (easy to happen when we're working at the tens-of-nanometers level). That means increasing the amperage to keep the same electron flow needed to switch all those transistors, and now it's down to less than 1.5V at 40+ amps, which is at the physical limit of what the aluminum traces can handle, as well as of the heat-transfer capability of the silicon matrix. Scientists are working on better materials, like copper or silver for the traces and a diamond matrix, but that won't buy more than a few more years for Moore's Law.
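For a rough sense of scale, here is a back-of-the-envelope sketch of what those numbers imply. The ~1.5 V supply and 40+ A draw are taken from the figures above; the trace resistance is purely an illustrative assumption, not a measured value.

```python
# Back-of-the-envelope check of the power figures quoted above.
# The ~1.5 V supply and 40+ A draw come from the comment; the trace
# resistance is an illustrative assumption, not a measured value.

def die_power(volts, amps):
    """Power delivered to the chip: P = V * I."""
    return volts * amps

def trace_heat(amps, ohms):
    """Heat dissipated in a supply trace: P = I^2 * R."""
    return amps ** 2 * ohms

V, I = 1.5, 40.0   # volts, amps (figures from the comment above)
R = 0.001          # assumed 1 milliohm of trace resistance (illustrative)

print(f"Power into the die: {die_power(V, I):.0f} W")   # ~60 W
print(f"Heat in the trace:  {trace_heat(I, R):.1f} W")  # ~1.6 W at 1 milliohm
```

Even that small, assumed trace resistance turns into more than a watt of waste heat at 40 A, which is why the voltage/current trade-off is such a hard wall.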
As a matter of fact, we're getting close to atomic-scale circuitry. A current hard drive uses on the order of one million atoms to store one bit (yes/no) of information. IBM noted that their scientists have a working model that uses 12 iron atoms per bit. Making single atoms do individual jobs may not be possible, but the difference between that and where we'll be in a few years is minuscule.
We'll see, but it's getting harder and harder to keep Moore's Law going.
Colleges and college professors have seen this coming for a while now. After all, the same information they charge for has been available in libraries for free for over a century. The strategies they have employed to keep their phony-baloney jobs have worked very well so far. Those include:
Credentials: again, colleges hand out those degrees and have been ruthless in taking down competition. Copyright: those lectures get taken down or never go up on youtube at all. Secretiveness: they write the textbooks, and those textbooks get worse every year. You and I are used to being able to read the textbook, ignore the teacher, and still understand the material. Not so easy now: the writing is convoluted and the information isn't all there. Dilution: they have watered down the primary and secondary school curriculum. This keeps folks in college longer, doing a year or two of pre-college courses to get the knowledge that should have been provided in high school. Win-win!
With the government squarely behind the colleges, it will take major business action to break the current stranglehold on information. Government has managed to keep businesses from using tests to hire applicants, calling it discrimination. So far, the only real business action has been to get in on the loan money and complain about the new workforce's capability. They have lobbyists and folks in their pockets; time to use that leverage.
Even if Moore's Law finally fails (and I agree that it will fairly soon), it could still be years before the momentum of computer-aided automation stops reshaping the job market. I still think there are many jobs that even our present, fairly capable processors could be doing but haven't been hooked up to yet. Then too, I think jobs are being lost to other technological improvements besides faster digital processors. I often think of the effect better building and siding materials have had - siding that is now more weatherproof and lasts much longer than before, putting painters and re-siders out of work.
Witness this momentum effect with automobiles. Cars, in terms of their basic capability, haven't really improved since the 1960s. (Basically, they now pollute less and have better sound systems and fancier electronics.) Still, roads and cars have continued to work their magic - encouraging sprawl, lengthening car commutes, and moving more people away from pedestrian and mass-transit commutes - right up until very recently.
I likewise think automation and other technology have further exploits coming soon, even without further laboratory development.
Stephen B
Jobs disappearing to technology is nothing new. The general pattern is that new technology replaces workers. A financial crisis ensues, since the assumptions underlying the roll-out have changed. Finally, the supernumeraries are liquidated in a war following the financial crisis. The only thing that has really changed is the education and intelligence of the supernumeraries; today they are engineers, writers, etc. Eventually they will hit a group of people who figure out what is going on and, instead of fighting each other, rebel. That may be happening now, or maybe not. It's too soon to tell.
I had been thinking that the processor would ultimately be limited by a combination of the size of the atom and the speed of light. However, reading Tweell's post has me thinking that we are essentially there now. Waste heat is proportional to the square of the current, so the increase in current he describes is generating heat quadratically. Better materials will only go so far if we keep bumping up the current. The only way to keep increasing processing power from here is a jump in our understanding.
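One compact way to write that relationship, assuming a fixed power budget P_die delivered to the chip at supply voltage V through a trace of resistance R:

```latex
P_{\text{trace}} = I^{2} R, \qquad I = \frac{P_{\text{die}}}{V}
\quad\Longrightarrow\quad P_{\text{trace}} = \frac{P_{\text{die}}^{2}}{V^{2}}\, R
```

So at a fixed power budget, halving the supply voltage doubles the current and quadruples the resistive heating in the traces for the same resistance - which is the quadratic growth described above.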
I tend to think knowledge follows an S-curve for any level of understanding. At the bottom and the top of the curve, discoveries are few and far between: at the bottom because we don't understand the field enough to do much with it, and at the top because there isn't much unexplored area left. Most of the discoveries are in the middle, where the onward-and-upward meme holds true and things may appear linear. That is where the whole managing-innovation hooey comes from. However, we are at or near the top of the S-curve in most fields right now. Eventually there will be new breakthroughs that put us at the bottom of new S-curves. An example of what I'm talking about would be the difference between Newton's understanding of gravity and Einstein's.
Newton's understanding was sufficient to put men on the moon and send probes to other planets. We really haven't started to use Einstein's yet, but it is both a paradigm shift and closer to the truth than Newton's understanding. There are lots of people like me who can understand scientific theories and put them to use. There are substantially fewer people who can make new discoveries around a given theory. The great men who make new discoveries out of the blue are few and far between; there is no telling when one of them will come along and put us on a new S-curve. Could be three days, could be three millennia.
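A minimal sketch of that S-curve idea, using a logistic function with arbitrary, purely illustrative parameters for the midpoint and steepness; the point is simply that the rate of discovery (the slope) is tiny at both ends and peaks in the middle.

```python
# A minimal sketch of the S-curve described above: cumulative understanding
# follows a logistic curve, so the rate of new discoveries (its slope) is
# small at the bottom, peaks in the middle, and tails off near the top.
# The midpoint and steepness are arbitrary illustrative parameters.
import math

def understanding(t, midpoint=50.0, steepness=0.1):
    """Cumulative knowledge on one S-curve, scaled from 0 to 1."""
    return 1.0 / (1.0 + math.exp(-steepness * (t - midpoint)))

def discovery_rate(t, dt=1e-3):
    """Numerical slope: how quickly new ground is being broken at time t."""
    return (understanding(t + dt) - understanding(t - dt)) / (2 * dt)

for t in (0, 25, 50, 75, 100):
    print(f"t={t:3d}  level={understanding(t):.2f}  rate={discovery_rate(t):.4f}")
# The rate peaks at the midpoint and is far smaller at either end - the
# "few and far between" pattern at the bottom and top of the curve.
```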
Best,
Dan
P.S. The lecture has its base in the lack of books before Gutenberg. The men attending the lectures were literate, but the university had only one volume of whatever was being discussed, and it was either chained to the podium or chained up in the library. Illuminated manuscripts were even more valuable when they were the only option than they are now, and they are still outrageously valuable today.
P.P.S. The technology for driverless vehicles is here now, but outside of the military it will not be used. The first time a driverless vehicle has an accident, lawfare will put paid to all that.
Dan: Thank you for that enlightening data point about the shortage of books. Excellent.
You need to look further back in human history to actually see the roots of the dehumanization we are witnessing now. Think about the way humans lived 100,000 years ago. There were no governments, just loosely banded tribes. There were no wars because human beings needed each other as defense against predators trying to eat them. Most of human existence was spent caring for the young, escaping feline predators (which greatly outnumbered humans) and foraging for food. Not everyone had the strength and intelligence to survive.
Fast forward to present times and take a look at the blog of John Hawks (one of the leading paleoanthropologists of this century) and you can begin to understand how the human race may be in a state of slow de-evolution and deterioration. Consider some other facts: Did you know the human brain has actually been shrinking and losing cerebral mass for the last 25,000 years? Even as recently as 3,000 years ago, areas of high population density show lower cerebral mass in the fossil record. The reasoning is that areas of high population yield the type of social safety structures which actually lead to a loss of human function and intelligence. People who would not survive on their wits alone get by with the assistance of others. But there will always be social safety nets, because most people don't want to watch other people suffer in pain and starve. Eventually the human population on the planet will diminish due to natural factors no one will be able to control. Until then, the next 100 years will probably be very difficult.