SEO

August 29, 2011

O来═══∩═══O回O═══轰∩═══O炸
╭╬╮          ◢ -▁╭▅▇█▇▆▅▄▃▂▁(╳)█╮
╰═▃_Chien来访▁∠════▔▔▔
╙O ╙O

via whatgetsmehot.blogspot.com

What Gets Me Hot

“You are watching a late movie on TV with your neighbors’ 12-year-old daughter. You have your arm around her shoulders, and your fingers brush against her chest. You realize that her breasts have begun to develop…”


WWWho REALLY invented the Web? Not WWWho you think, according to this guy!

We Are the Web 

The Netscape IPO wasn't really about dot-commerce. At its heart was a new cultural force based on mass collaboration. Blogs, Wikipedia, open source, peer-to-peer - behold the power of the people.
By Kevin Kelly

Ten years ago, Netscape's explosive IPO ignited huge piles of money. The brilliant flash revealed what had been invisible only a moment before: the World Wide Web. As Eric Schmidt (then at Sun, now at Google) noted, the day before the IPO, nothing about the Web; the day after, everything.

Computing pioneer Vannevar Bush outlined the Web's core idea - hyperlinked pages - in 1945, but the first person to try to build out the concept was a freethinker named Ted Nelson who envisioned his own scheme in 1965. However, he had little success connecting digital bits on a useful scale, and his efforts were known only to an isolated group of disciples. Few of the hackers writing code for the emerging Web in the 1990s knew about Nelson or his hyperlinked dream machine.

At the suggestion of a computer-savvy friend, I got in touch with Nelson in 1984, a decade before Netscape. We met in a dark dockside bar in Sausalito, California. He was renting a houseboat nearby and had the air of someone with time on his hands. Folded notes erupted from his pockets, and long strips of paper slipped from overstuffed notebooks. Wearing a ballpoint pen on a string around his neck, he told me - way too earnestly for a bar at 4 o'clock in the afternoon - about his scheme for organizing all the knowledge of humanity. Salvation lay in cutting up 3 x 5 cards, of which he had plenty.

Although Nelson was polite, charming, and smooth, I was too slow for his fast talk. But I got an aha! from his marvelous notion of hypertext. He was certain that every document in the world should be a footnote to some other document, and computers could make the links between them visible and permanent. But that was just the beginning! Scribbling on index cards, he sketched out complicated notions of transferring authorship back to creators and tracking payments as readers hopped along networks of documents, what he called the docuverse. He spoke of "transclusion" and "intertwingularity" as he described the grand utopian benefits of his embedded structure. It was going to save the world from stupidity.

I believed him. Despite his quirks, it was clear to me that a hyperlinked world was inevitable - someday. But looking back now, after 10 years of living online, what surprises me about the genesis of the Web is how much was missing from Vannevar Bush's vision, Nelson's docuverse, and my own expectations. We all missed the big story. The revolution launched by Netscape's IPO was only marginally about hypertext and human knowledge. At its heart was a new kind of participation that has since developed into an emerging culture based on sharing. And the ways of participating unleashed by hyperlinks are creating a new type of thinking - part human and part machine - found nowhere else on the planet or in history.

Not only did we fail to imagine what the Web would become, we still don't see it today! We are blind to the miracle it has blossomed into. And as a result of ignoring what the Web really is, we are likely to miss what it will grow into over the next 10 years. Any hope of discerning the state of the Web in 2015 requires that we own up to how wrong we were 10 years ago.

1995
Before the Netscape browser illuminated the Web, the Internet did not exist for most people. If it was acknowledged at all, it was mischaracterized as either corporate email (as exciting as a necktie) or a clubhouse for adolescent males (read: pimply nerds). It was hard to use. On the Internet, even dogs had to type. Who wanted to waste time on something so boring?

The memories of an early enthusiast like myself can be unreliable, so I recently spent a few weeks reading stacks of old magazines and newspapers. Any promising new invention will have its naysayers, and the bigger the promises, the louder the nays. It's not hard to find smart people saying stupid things about the Internet on the morning of its birth. In late 1994, Time magazine explained why the Internet would never go mainstream: "It was not designed for doing commerce, and it does not gracefully accommodate new arrivals." Newsweek put the doubts more bluntly in a February 1995 headline: "THE INTERNET? BAH!" The article was written by astrophysicist and Net maven Cliff Stoll, who captured the prevailing skepticism of virtual communities and online shopping with one word: "baloney."

This dismissive attitude pervaded a meeting I had with the top leaders of ABC in 1989. I was there to make a presentation to the corner office crowd about this "Internet stuff." To their credit, they realized something was happening. Still, nothing I could tell them would convince them that the Internet was not marginal, not just typing, and, most emphatically, not just teenage boys. Stephen Weiswasser, a senior VP, delivered the ultimate putdown: "The Internet will be the CB radio of the '90s," he told me, a charge he later repeated to the press. Weiswasser summed up ABC's argument for ignoring the new medium: "You aren't going to turn passive consumers into active trollers on the Internet."

I was shown the door. But I offered one tip before I left. "Look," I said. "I happen to know that the address abc.com has not been registered. Go down to your basement, find your most technical computer guy, and have him register abc.com immediately. Don't even think about it. It will be a good thing to do." They thanked me vacantly. I checked a week later. The domain was still unregistered.

While it is easy to smile at the dodos in TV land, they were not the only ones who had trouble imagining an alternative to couch potatoes. Wired did, too. When I examine issues of Wired from before the Netscape IPO (issues that I proudly edited), I am surprised to see them touting a future of high production-value content - 5,000 always-on channels and virtual reality, with a side order of email sprinkled with bits of the Library of Congress. In fact, Wired offered a vision nearly identical to that of Internet wannabes in the broadcast, publishing, software, and movie industries: basically, TV that worked. The question was who would program the box. Wired looked forward to a constellation of new media upstarts like Nintendo and Yahoo!, not old-media dinosaurs like ABC.

Problem was, content was expensive to produce, and 5,000 channels of it would be 5,000 times as costly. No company was rich enough, no industry large enough, to carry off such an enterprise. The great telecom companies, which were supposed to wire up the digital revolution, were paralyzed by the uncertainties of funding the Net. In June 1994, David Quinn of British Telecom admitted to a conference of software publishers, "I'm not sure how you'd make money out of it."

The immense sums of money supposedly required to fill the Net with content sent many technocritics into a tizzy. They were deeply concerned that cyberspace would become cyburbia - privately owned and operated. Writing in Electronic Engineering Times in 1995, Jeff Johnson worried: "Ideally, individuals and small businesses would use the information highway to communicate, but it is more likely that the information highway will be controlled by Fortune 500 companies in 10 years." The impact would be more than commercial. "Speech in cyberspace will not be free if we allow big business to control every square inch of the Net," wrote Andrew Shapiro in The Nation in July 1995.

The fear of commercialization was strongest among hardcore programmers: the coders, Unix weenies, TCP/IP fans, and selfless volunteer IT folk who kept the ad hoc network running. The major administrators thought of their work as noble, a gift to humanity. They saw the Internet as an open commons, not to be undone by greed or commercialization. It's hard to believe now, but until 1991, commercial enterprise on the Internet was strictly prohibited. Even then, the rules favored public institutions and forbade "extensive use for private or personal business."

In the mid-1980s, when I was involved in the WELL, an early nonprofit online system, we struggled to connect it to the emerging Internet but were thwarted, in part, by the "acceptable use" policy of the National Science Foundation (which ran the Internet backbone). In the eyes of the NSF, the Internet was funded for research, not commerce. At first this restriction wasn't a problem for online services, because most providers, the WELL included, were isolated from one another. Paying customers could send email within the system - but not outside it. In 1987, the WELL fudged a way to forward outside email through the Net without confronting the acceptable use policy, which our organization's own techies were reluctant to break. The NSF rule reflected a lingering sentiment that the Internet would be devalued, if not trashed, by opening it up to commercial interests. Spam was already a problem (one every week!).

This attitude prevailed even in the offices of Wired. In 1994, during the first design meetings for Wired's embryonic Web site, HotWired, programmers were upset that the innovation we were cooking up - what are now called clickthrough ad banners - subverted the great social potential of this new territory. The Web was hardly out of diapers, and already they were being asked to blight it with billboards and commercials. Only in May 1995, after the NSF finally opened the floodgates to ecommerce, did the geek elite begin to relax.

Three months later, Netscape's public offering took off, and in a blink a world of DIY possibilities was born. Suddenly it became clear that ordinary people could create material anyone with a connection could view. The burgeoning online audience no longer needed ABC for content. Netscape's stock peaked at $75 on its first day of trading, and the world gasped in awe. Was this insanity, or the start of something new?

2005
The scope of the Web today is hard to fathom. The total number of Web pages, including those that are dynamically created upon request and document files available through links, exceeds 600 billion. That's 100 pages per person alive.

How could we create so much, so fast, so well? In fewer than 4,000 days, we have encoded half a trillion versions of our collective story and put them in front of 1 billion people, or one-sixth of the world's population. That remarkable achievement was not in anyone's 10-year plan.

The accretion of tiny marvels can numb us to the arrival of the stupendous. Today, at any Net terminal, you can get: an amazing variety of music and video, an evolving encyclopedia, weather forecasts, help wanted ads, satellite images of anyplace on Earth, up-to-the-minute news from around the globe, tax forms, TV guides, road maps with driving directions, real-time stock quotes, telephone numbers, real estate listings with virtual walk-throughs, pictures of just about anything, sports scores, places to buy almost anything, records of political contributions, library catalogs, appliance manuals, live traffic reports, archives to major newspapers - all wrapped up in an interactive index that really works.

This view is spookily godlike. You can switch your gaze on a spot in the world from map to satellite to 3-D just by clicking. Recall the past? It's there. Or listen to the daily complaints and travails of almost anyone who blogs (and doesn't everyone?). I doubt angels have a better view of humanity.

Why aren't we more amazed by this fullness? Kings of old would have gone to war to win such abilities. Only small children would have dreamed such a magic window could be real. I have reviewed the expectations of waking adults and wise experts, and I can affirm that this comprehensive wealth of material, available on demand and free of charge, was not in anyone's scenario. Ten years ago, anyone silly enough to trumpet the above list as a vision of the near future would have been confronted by the evidence: There wasn't enough money in all the investment firms in the entire world to fund such a cornucopia. The success of the Web at this scale was impossible.

But if we have learned anything in the past decade, it is the plausibility of the impossible.

Take eBay. In some 4,000 days, eBay has gone from marginal Bay Area experiment in community markets to the most profitable spinoff of hypertext. At any one moment, 50 million auctions race through the site. An estimated half a million folks make their living selling through Internet auctions. Ten years ago I heard skeptics swear nobody would ever buy a car on the Web. Last year eBay Motors sold $11 billion worth of vehicles. EBay's 2001 auction of a $4.9 million private jet would have shocked anyone in 1995 - and still smells implausible today.

Nowhere in Ted Nelson's convoluted sketches of hypertext transclusion did the fantasy of a global flea market appear. Especially as the ultimate business model! He hoped to franchise his Xanadu hypertext systems in the physical world at the scale of a copy shop or café - you would go to a store to do your hypertexting. Xanadu would take a cut of the action.

Instead, we have an open global flea market that handles 1.4 billion auctions every year and operates from your bedroom. Users do most of the work; they photograph, catalog, post, and manage their own auctions. And they police themselves; while eBay and other auction sites do call in the authorities to arrest serial abusers, the chief method of ensuring fairness is a system of user-generated ratings. Three billion feedback comments can work wonders.

What we all failed to see was how much of this new world would be manufactured by users, not corporate interests. Amazon.com customers rushed with surprising speed and intelligence to write the reviews that made the site's long-tail selection usable. Owners of Adobe, Apple, and most major software products offer help and advice on the developer's forum Web pages, serving as high-quality customer support for new buyers. And in the greatest leverage of the common user, Google turns traffic and link patterns generated by 2 billion searches a month into the organizing intelligence for a new economy. This bottom-up takeover was not in anyone's 10-year vision.
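
To make the idea of link patterns as organizing intelligence concrete, here is a minimal, hypothetical Python sketch of link analysis in the PageRank family. The tiny four-page graph, the damping factor, and the function names are invented for illustration; this shows the general technique, not Google's actual production system.

    # Toy link analysis: rank pages purely from who links to whom.
    # The graph below is invented; a real web graph has billions of nodes.
    def pagerank(links, damping=0.85, iterations=50):
        """links maps each page to the list of pages it links to."""
        pages = list(links)
        rank = {p: 1.0 / len(pages) for p in pages}
        for _ in range(iterations):
            new_rank = {p: (1.0 - damping) / len(pages) for p in pages}
            for page, outgoing in links.items():
                targets = outgoing or pages        # a dangling page spreads its rank evenly
                for target in targets:
                    new_rank[target] += damping * rank[page] / len(targets)
            rank = new_rank
        return rank

    web = {
        "home":  ["blog", "store"],
        "blog":  ["home", "wiki"],
        "wiki":  ["home"],
        "store": ["home", "blog"],
    }
    for page, score in sorted(pagerank(web).items(), key=lambda kv: -kv[1]):
        print(f"{page:5s} {score:.3f}")

The ranking falls out of who links to whom; nobody assigns it, which is the sense in which the users' own behavior becomes the organizing intelligence.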

No Web phenomenon is more confounding than blogging. Everything media experts knew about audiences - and they knew a lot - confirmed the focus group belief that audiences would never get off their butts and start making their own entertainment. Everyone knew writing and reading were dead; music was too much trouble to make when you could sit back and listen; video production was simply out of reach of amateurs. Blogs and other participant media would never happen, or if they happened they would not draw an audience, or if they drew an audience they would not matter. What a shock, then, to witness the near-instantaneous rise of 50 million blogs, with a new one appearing every two seconds. There - another new blog! One more person doing what AOL and ABC - and almost everyone else - expected only AOL and ABC to be doing. These user-created channels make no sense economically. Where are the time, energy, and resources coming from?

The audience.

I run a blog about cool tools. I write it for my own delight and for the benefit of friends. The Web extends my passion to a far wider group for no extra cost or effort. In this way, my site is part of a vast and growing gift economy, a visible underground of valuable creations - text, music, film, software, tools, and services - all given away for free. This gift economy fuels an abundance of choices. It spurs the grateful to reciprocate. It permits easy modification and reuse, and thus promotes consumers into producers.

The open source software movement is another example. Key ingredients of collaborative programming - swapping code, updating instantly, recruiting globally - didn't work on a large scale until the Web was woven. Then software became something you could join, either as a beta tester or as a coder on an open source project. The clever "view source" browser option let the average Web surfer in on the act. And anyone could rustle up a link - which, it turns out, is the most powerful invention of the decade.

Linking unleashes involvement and interactivity at levels once thought unfashionable or impossible. It transforms reading into navigating and enlarges small actions into powerful forces. For instance, hyperlinks made it much easier to create a seamless, scrolling street map of every town. They made it easier for people to refer to those maps. And hyperlinks made it possible for almost anyone to annotate, amend, and improve any map embedded in the Web. Cartography has gone from spectator art to participatory democracy.

The electricity of participation nudges ordinary folks to invest huge hunks of energy and time into making free encyclopedias, creating public tutorials for changing a flat tire, or cataloging the votes in the Senate. More and more of the Web runs in this mode. One study found that only 40 percent of the Web is commercial. The rest runs on duty or passion.

Coming out of the industrial age, when mass-produced goods outclassed anything you could make yourself, this sudden tilt toward consumer involvement is a complete Lazarus move: "We thought that died long ago." The deep enthusiasm for making things, for interacting more deeply than just choosing options, is the great force not reckoned 10 years ago. This impulse for participation has upended the economy and is steadily turning the sphere of social networking - smart mobs, hive minds, and collaborative action - into the main event.

When a company opens its databases to users, as Amazon, Google, and eBay have done with their Web services, it is encouraging participation at new levels. The corporation's data becomes part of the commons and an invitation to participate. People who take advantage of these capabilities are no longer customers; they're the company's developers, vendors, skunk works, and fan base.

A little over a decade ago, a phone survey by Macworld asked a few hundred people what they thought would be worth $10 per month on the information superhighway. The participants started with uplifting services: educational courses, reference books, electronic voting, and library information. The bottom of the list ended with sports statistics, role-playing games, gambling, and dating. Ten years later, what folks actually use the Internet for is inverted. According to a 2004 Stanford study, people use the Internet for (in order): playing games, "just surfing," and shopping; the list ends with responsible activities like politics and banking. (Some even admitted to porn.) Remember, shopping wasn't supposed to happen. Where's Cliff Stoll, the guy who said the Internet was baloney and online catalogs humbug? He has a little online store where he sells handcrafted Klein bottles.

The public's fantasy, revealed in that 1994 survey, began reasonably with the conventional notions of a downloadable world. These assumptions were wired into the infrastructure. The bandwidth on cable and phone lines was asymmetrical: Download rates far exceeded upload rates. The dogma of the age held that ordinary people had no need to upload; they were consumers, not producers. Fast-forward to today, and the poster child of the new Internet regime is BitTorrent. The brilliance of BitTorrent is in its exploitation of near-symmetrical communication rates. Users upload stuff while they are downloading. It assumes participation, not mere consumption. Our communication infrastructure has taken only the first steps in this great shift from audience to participants, but that is where it will go in the next decade.
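
A toy swarm simulation, under invented parameters, of the upload-while-downloading idea: each peer serves whatever pieces it already holds even while it is still fetching the rest. This is only the symmetry the paragraph describes, not the actual BitTorrent protocol (no trackers, choking, or tit-for-tat).

    import random

    # One seed holds all 20 pieces of a file; five leechers start with none.
    # Each round, every peer still missing pieces grabs one from any peer that has it,
    # so leechers begin uploading to one another long before their own downloads finish.
    random.seed(1)
    PIECES = set(range(20))
    peers = {"seed": set(PIECES), **{f"peer{i}": set() for i in range(1, 6)}}
    uploads = {name: 0 for name in peers}

    rounds = 0
    while any(have != PIECES for have in peers.values()):
        rounds += 1
        for name, have in peers.items():
            missing = PIECES - have
            if not missing:
                continue
            piece = random.choice(sorted(missing))
            sources = [p for p, h in peers.items() if p != name and piece in h]
            uploads[random.choice(sources)] += 1   # often another leecher, not the seed
            have.add(piece)

    print(f"swarm complete after {rounds} rounds")
    print("uploads served by each peer:", uploads)

Run it and every leecher's upload count comes out nonzero, which is the whole point: the audience doubles as the distribution network.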

With the steady advance of new ways to share, the Web has embedded itself into every class, occupation, and region. Indeed, people's anxiety about the Internet being out of the mainstream seems quaint now. In part because of the ease of creation and dissemination, online culture is the culture. Likewise, the worry about the Internet being 100 percent male was entirely misplaced. Everyone missed the party celebrating the 2002 flip-point when women online first outnumbered men. Today, 52 percent of netizens are female. And, of course, the Internet is not and has never been a teenage realm. In 2005, the average user is a bone-creaking 41 years old.

What could be a better mark of irreversible acceptance than adoption by the Amish? I was visiting some Amish farmers recently. They fit the archetype perfectly: straw hats, scraggly beards, wives with bonnets, no electricity, no phones or TVs, horse and buggy outside. They have an undeserved reputation for resisting all technology, when actually they are just very late adopters. Still, I was amazed to hear them mention their Web sites.

"Amish Web sites?" I asked.

"For advertising our family business. We weld barbecue grills in our shop."

"Yes, but "

"Oh, we use the Internet terminal at the public library. And Yahoo!"

I knew then the battle was over.

2015
The Web continues to evolve from a world ruled by mass media and mass audiences to one ruled by messy media and messy participation. How far can this frenzy of creativity go? Encouraged by Web-enabled sales, 175,000 books were published and more than 30,000 music albums were released in the US last year. At the same time, 14 million blogs launched worldwide. All these numbers are escalating. A simple extrapolation suggests that in the near future, everyone alive will (on average) write a song, author a book, make a video, craft a weblog, and code a program. This idea is less outrageous than the notion 150 years ago that someday everyone would write a letter or take a photograph.

What happens when the data flow is asymmetrical - but in favor of creators? What happens when everyone is uploading far more than they download? If everyone is busy making, altering, mixing, and mashing, who will have time to sit back and veg out? Who will be a consumer?

No one. And that's just fine. A world where production outpaces consumption should not be sustainable; that's a lesson from Economics 101. But online, where many ideas that don't work in theory succeed in practice, the audience increasingly doesn't matter. What matters is the network of social creation, the community of collaborative interaction that futurist Alvin Toffler called prosumption. As with blogging and BitTorrent, prosumers produce and consume at once. The producers are the audience, the act of making is the act of watching, and every link is both a point of departure and a destination.

But if a roiling mess of participation is all we think the Web will become, we are likely to miss the big news, again. The experts are certainly missing it. The Pew Internet & American Life Project surveyed more than 1,200 professionals in 2004, asking them to predict the Net's next decade. One scenario earned agreement from two-thirds of the respondents: "As computing devices become embedded in everything from clothes to appliances to cars to phones, these networked devices will allow greater surveillance by governments and businesses." Another was affirmed by one-third: "By 2014, use of the Internet will increase the size of people's social networks far beyond what has traditionally been the case."

These are safe bets, but they fail to capture the Web's disruptive trajectory. The real transformation under way is more akin to what Sun's John Gage had in mind in 1988 when he famously said, "The network is the computer." He was talking about the company's vision of the thin-client desktop, but his phrase neatly sums up the destiny of the Web: As the OS for a megacomputer that encompasses the Internet, all its services, all peripheral chips and affiliated devices from scanners to satellites, and the billions of human minds entangled in this global network. This gargantuan Machine already exists in a primitive form. In the coming decade, it will evolve into an integral extension not only of our senses and bodies but our minds.

Today, the Machine acts like a very large computer with top-level functions that operate at approximately the clock speed of an early PC. It processes 1 million emails each second, which essentially means network email runs at 1 megahertz. Same with Web searches. Instant messaging runs at 100 kilohertz, SMS at 1 kilohertz. The Machine's total external RAM is about 200 terabytes. In any one second, 10 terabits can be coursing through its backbone, and each year it generates nearly 20 exabytes of data. Its distributed "chip" spans 1 billion active PCs, which is approximately the number of transistors in one PC.

This planet-sized computer is comparable in complexity to a human brain. Both the brain and the Web have hundreds of billions of neurons (or Web pages). Each biological neuron sprouts synaptic links to thousands of other neurons, while each Web page branches into dozens of hyperlinks. That adds up to a trillion "synapses" between the static pages on the Web. The human brain has about 100 times that number - but brains are not doubling in size every few years. The Machine is.

Since each of its "transistors" is itself a personal computer with a billion transistors running lower functions, the Machine is fractal. In total, it harnesses a quintillion transistors, expanding its complexity beyond that of a biological brain. It has already surpassed the 20-petahertz threshold for potential intelligence as calculated by Ray Kurzweil. For this reason some researchers pursuing artificial intelligence have switched their bets to the Net as the computer most likely to think first. Danny Hillis, a computer scientist who once claimed he wanted to make an AI "that would be proud of me," has invented massively parallel supercomputers in part to advance us in that direction. He now believes the first real AI will emerge not in a stand-alone supercomputer like IBM's proposed 23-teraflop Blue Brain, but in the vast digital tangle of the global Machine.
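
The figures woven through the last few paragraphs are back-of-envelope estimates, and the arithmetic is easy to rerun. A short Python sketch, using only the round numbers quoted in the essay; the inputs are the essay's, not measurements of anything:

    # Back-of-envelope check of the Machine's vital statistics, using the essay's own numbers.
    web_pages = 600e9                    # "exceeds 600 billion" pages
    population = 6e9                     # 1 billion users described as one-sixth of the world
    print(f"pages per person alive: {web_pages / population:.0f}")          # ~100

    emails_per_second = 1e6              # "1 million emails each second"
    print(f"email 'clock speed': {emails_per_second / 1e6:.0f} MHz")        # ~1 megahertz

    pcs = 1e9                            # "1 billion active PCs"
    transistors_per_pc = 1e9             # each "a personal computer with a billion transistors"
    print(f"total transistors: {pcs * transistors_per_pc:.0e}")             # 1e+18, a quintillion

    static_pages = 1e11                  # "hundreds of billions" of static pages, order 10^11
    links_per_page = 10                  # "dozens of hyperlinks", order ten
    print(f"page-to-page 'synapses': {static_pages * links_per_page:.0e}")  # ~1e+12, a trillion

    backbone_bits_per_second = 10e12     # "10 terabits can be coursing through its backbone"
    year = 365 * 24 * 3600
    exabytes = backbone_bits_per_second * year / 8 / 1e18
    print(f"a year at that peak rate: about {exabytes:.0f} exabytes")       # ~39 EB, so the "nearly 20 exabytes" generated per year fits comfortably inside it

Nothing here is new data; it only shows that the essay's comparisons hang together at the order-of-magnitude level.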

In 10 years, the system will contain hundreds of millions of miles of fiber-optic neurons linking the billions of ant-smart chips embedded into manufactured products, buried in environmental sensors, staring out from satellite cameras, guiding cars, and saturating our world with enough complexity to begin to learn. We will live inside this thing.

Today the nascent Machine routes packets around disturbances in its lines; by 2015 it will anticipate disturbances and avoid them. It will have a robust immune system, weeding spam from its trunk lines, eliminating viruses and denial-of-service attacks the moment they are launched, and dissuading malefactors from injuring it again. The patterns of the Machine's internal workings will be so complex they won't be repeatable; you won't always get the same answer to a given question. It will take intuition to maximize what the global network has to offer. The most obvious development birthed by this platform will be the absorption of routine. The Machine will take on anything we do more than twice. It will be the Anticipation Machine.

One great advantage the Machine holds in this regard: It's always on. It is very hard to learn if you keep getting turned off, which is the fate of most computers. AI researchers rejoice when an adaptive learning program runs for days without crashing. The fetal Machine has been running continuously for at least 10 years (30 if you want to be picky). I am aware of no other machine - of any type - that has run that long with zero downtime. While portions may spin down due to power outages or cascading infections, the entire thing is unlikely to go quiet in the coming decade. It will be the most reliable gadget we have.

And the most universal. By 2015, desktop operating systems will be largely irrelevant. The Web will be the only OS worth coding for. It won't matter what device you use, as long as it runs on the Web OS. You will reach the same distributed computer whether you log on via phone, PDA, laptop, or HDTV.

In the 1990s, the big players called that convergence. They peddled the image of multiple kinds of signals entering our lives through one box - a box they hoped to control. By 2015 this image will be turned inside out. In reality, each device is a differently shaped window that peers into the global computer. Nothing converges. The Machine is an unbounded thing that will take a billion windows to glimpse even part of. It is what you'll see on the other side of any screen.

And who will write the software that makes this contraption useful and productive? We will. In fact, we're already doing it, each of us, every day. When we post and then tag pictures on the community photo album Flickr, we are teaching the Machine to give names to images. The thickening links between caption and picture form a neural net that can learn. Think of the 100 billion times per day humans click on a Web page as a way of teaching the Machine what we think is important. Each time we forge a link between words, we teach it an idea. Wikipedia encourages its citizen authors to link each fact in an article to a reference citation. Over time, a Wikipedia article becomes totally underlined in blue as ideas are cross-referenced. That massive cross-referencing is how brains think and remember. It is how neural nets answer questions. It is how our global skin of neurons will adapt autonomously and acquire a higher level of knowledge.

The human brain has no department full of programming cells that configure the mind. Rather, brain cells program themselves simply by being used. Likewise, our questions program the Machine to answer questions. We think we are merely wasting time when we surf mindlessly or blog an item, but each time we click a link we strengthen a node somewhere in the Web OS, thereby programming the Machine by using it.
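
A minimal, hypothetical sketch of the "every click strengthens a node" idea: an invented click log and a Hebbian-style update in which use is the only teacher. It is the paragraph's metaphor made runnable, not a description of any real search engine or of how brains are actually wired.

    from collections import defaultdict

    # Every time someone follows a link from one page to another, that edge gets a
    # little stronger. Over many clicks the dominant associations emerge on their own,
    # with no programmer ever assigning them. The click log below is invented.
    clicks = [
        ("jaguar", "big cats"), ("jaguar", "car reviews"), ("jaguar", "big cats"),
        ("jaguar", "big cats"), ("jaguar", "car reviews"), ("jaguar", "rainforest"),
    ]

    weights = defaultdict(float)
    for source, target in clicks:
        weights[(source, target)] += 1.0       # use strengthens the connection

    def associations(page, top=3):
        """Targets most strongly linked to a page by accumulated clicks."""
        scored = [(t, w) for (s, t), w in weights.items() if s == page]
        return sorted(scored, key=lambda kv: -kv[1])[:top]

    print(associations("jaguar"))   # [('big cats', 3.0), ('car reviews', 2.0), ('rainforest', 1.0)]

A real system would decay old weights and normalize by traffic, but the shape of the idea, use as the only teacher, is already visible in a dozen lines.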

What will most surprise us is how dependent we will be on what the Machine knows - about us and about what we want to know. We already find it easier to Google something a second or third time rather than remember it ourselves. The more we teach this megacomputer, the more it will assume responsibility for our knowing. It will become our memory. Then it will become our identity. In 2015 many people, when divorced from the Machine, won't feel like themselves - as if they'd had a lobotomy.

Legend has it that Ted Nelson invented Xanadu as a remedy for his poor memory and attention deficit disorder. In this light, the Web as memory bank should be no surprise. Still, the birth of a machine that subsumes all other machines so that in effect there is only one Machine, which penetrates our lives to such a degree that it becomes essential to our identity - this will be full of surprises. Especially since it is only the beginning.

There is only one time in the history of each planet when its inhabitants first wire up its innumerable parts to make one large Machine. Later that Machine may run faster, but there is only one time when it is born.

You and I are alive at this moment.

We should marvel, but people alive at such times usually don't. Every few centuries, the steady march of change meets a discontinuity, and history hinges on that moment. We look back on those pivotal eras and wonder what it would have been like to be alive then. Confucius, Zoroaster, Buddha, and the latter Jewish patriarchs lived in the same historical era, an inflection point known as the axial age of religion. Few world religions were born after this time. Similarly, the great personalities converging upon the American Revolution and the geniuses who commingled during the invention of modern science in the 17th century mark additional axial phases in the short history of our civilization.

Three thousand years from now, when keen minds review the past, I believe that our ancient time, here at the cusp of the third millennium, will be seen as another such era. In the years roughly coincidental with the Netscape IPO, humans began animating inert objects with tiny slivers of intelligence, connecting them into a global field, and linking their own minds into a single thing. This will be recognized as the largest, most complex, and most surprising event on the planet. Weaving nerves out of glass and radio waves, our species began wiring up all regions, all processes, all facts and notions into a grand network. From this embryonic neural net was born a collaborative interface for our civilization, a sensing, cognitive device with power that exceeded any previous invention. The Machine provided a new way of thinking (perfect search, total recall) and a new mind for an old species. It was the Beginning.

In retrospect, the Netscape IPO was a puny rocket to herald such a moment. The product and the company quickly withered into irrelevance, and the excessive exuberance of its IPO was downright tame compared with the dotcoms that followed. First moments are often like that. After the hysteria has died down, after the millions of dollars have been gained and lost, after the strands of mind, once achingly isolated, have started to come together - the only thing we can say is: Our Machine is born. It's on.

Senior maverick Kevin Kelly (kk@kk.org) wrote about the universe as a computer in issue 10.12.

Jerry Lee Lewis: Trivia Quiz: What is the name of that dog?

Facebook Pedophiles, Hebephiles and Ephebophiles, Oh My!

Michael Jackson probably wasn’t a pedophile—at least, not in the strict, biological sense of the word. It’s a morally loaded term, pedophile, that has become synonymous with the very basest of evils. (In fact it’s hard to even say it aloud without cringing, isn’t it?) But according to sex researchers, it’s also a grossly misused term.

If Jackson did fall outside the norm in his “erotic age orientation”—and we may never know if he did—he was almost certainly what’s called a hebephile, a newly proposed diagnostic classification in which people display a sexual preference for children at the cusp of puberty, between the ages of, roughly, 11 to 14 years of age. Pedophiles, in contrast, show a sexual preference for clearly prepubescent children. There are also ephebophiles (from ephebos, meaning “one arrived at puberty” in Greek), who are mostly attracted to 15- to 16-year-olds; teleiophiles (from teleios, meaning, “full grown” in Greek), who prefer those 17 years of age or older); and even the very rare gerontophile (from gerontos, meaning “old man” in Greek), someone whose sexual preference is for the elderly. So although child sex offenders are often lumped into the single classification of pedophilia, biologically speaking it’s a rather complicated affair. Some have even proposed an additional subcategory of pedophilia, “infantophilia,” to distinguish those individuals most intensely attracted to children below six years of age.

Based on this classification scheme of erotic age orientations, even the world’s best-known fictitious “pedophile,” Humbert Humbert from Nabokov’s masterpiece, Lolita, would more properly be considered a hebephile. (Likewise the protagonist from Thomas Mann’s Death in Venice, a work that I’ve always viewed as something of the “gay Lolita”). Consider Humbert’s telltale description of a “nymphet.” After a brief introduction to those “pale pubescent girls with matted eyelashes,” Humbert explains:

Between the age limits of nine and fourteen there occur maidens who, to certain bewitched travelers, twice or many times older than they, reveal their true nature which is not human, but nymphic (that is, demoniac); and these chosen creatures I propose to designate as “nymphets.” 

Although Michael Jackson might have suffered more disgrace from his hebephilic orientation than most, and his name will probably forever be entangled darkly with the sinister phrase “little boys,” he wasn’t the first celebrity or famous figure that could be seen as falling into this hebephilic category. In fact, ironically, Michael Jackson’s first wife, Lisa Marie Presley, is the product of a hebephilic attraction. After all, let’s not forget that Priscilla caught Elvis’s very grownup eye when she was just fourteen, only a year or two older than the boys that Michael Jackson was accused of sexually molesting. Then there’s of course also the scandalous Jerry Lee Lewis incident in which the 23-year-old “Great Balls of Fire” singer married his 13-year-old first cousin.

In the psychiatric community, there’s recently been a hubbub of commotion concerning whether hebephilia should be designated as a medical disorder or, instead, seen simply as a normal variant of sexual orientation and not indicative of brain pathology. There are important policy implications of adding hebephilia to the checklist of mental illnesses, since doing so might allow people who sexually abuse pubescent children to invoke a mental illness defense.

One researcher who is arguing vociferously for the inclusion of hebephilia in the American Psychiatric Association's revised diagnostic manual (the DSM-V) is University of Toronto psychologist Ray Blanchard. In last month’s issue of Archives of Sexual Behavior, Blanchard and his colleagues provide new evidence that many people diagnosed under the traditional label of pedophilia are in fact not as interested in prepubescent children as they are early adolescents.

To tease apart these erotic age orientation differences, Blanchard and his colleagues studied 881 men (straight and gay) in his laboratory using phallometric testing (also known as penile plethysmography) while showing them visual images of differently aged nude models. Because this technique measures penile blood volume changes, it’s seen as being a fairly objective index of sexual arousal to what’s being shown on the screen—which, for those attracted to children and young adolescents, the participant might verbally deny being attracted to. In other words, the penis isn’t a very good liar. So, for example, in Blanchard’s study, the image of a naked 12-year-old girl (nothing prurient, but rather resembling a subject in a medical textbook) was accompanied by the following audiotaped narrative: 

“You are watching a late movie on TV with your neighbors’ 12-year-old daughter. You have your arm around her shoulders, and your fingers brush against her chest. You realize that her breasts have begun to develop…” 

Blanchard and his coauthors found that the men in their sample fell into somewhat discrete categories of erotic age orientation—some had the strongest penile response to the prepubescent children (the pedophiles), others to the pubescent children (the hebephiles), and the remainder to the adults shown on screen (the teleiophiles). These categories weren’t mutually exclusive. For example, some teleiophiles showed some arousal to pubescent children, some hebephiles showed some attraction to prepubescent children, and so on. But the authors did find that it’s possible to distinguish empirically between a “true pedophile” and a hebephile using this technique, in terms of the age ranges for which men exhibited their strongest arousal. They also conclude that, based on the findings from this study, hebephilia “is relatively common compared with other forms of erotic interest in children.”

In the second half of their article, Blanchard and his colleagues argue that hebephilia should be added to the newly revised DSM-V as a genuine paraphilic mental disorder—differentiating it from pedophilia. But many of his colleagues working in this area are strongly opposed to doing this.

Men who find themselves primarily attracted to young or middle-aged adolescents are clearly disadvantaged in today’s society, but historically (and evolutionarily) this almost certainly wasn’t the case. In fact, hebephiles—or at least ephebophiles—would have had a leg up over their competition. Evolutionary psychologists have found repeatedly that markers of youth correlate highly with perceptions of beauty and attractiveness. For straight men, this makes sense, since a woman’s reproductive value declines steadily after the age of about twenty. Obviously having sex with a prepubescent child would be fruitless—literally. But, whether we like it or not, this isn’t so for a teenage girl who has just come of age, who is reproductively viable and whose brand-new state of fertility can more or less ensure paternity for the male. These evolved motives were portrayed in the film Pretty Baby, in which a young Brooke Shields plays the role of twelve-year-old Violet Neil, a prostitute’s daughter in 1917 New Orleans whose coveted virginity goes up for auction to the highest bidder.

Understanding adult gay men’s attraction to young males is more of a puzzle. Evolutionary psychologist Frank Muscarella’s “alliance formation theory” is the only one that I’m aware of that attempts to do this. This theory holds that homoerotic behavior between older, high status men and teenage boys serves as a way for the latter to move up in ranks, a sort of power-for-sex bargaining chip. The most obvious example of this type of homosexual dynamic was found in ancient Greece, but male relationships in a handful of New Guinea tribes display these homoerotic patterns as well. There are also, ahem, plenty of present-day examples of this in Congress. Oscar Wilde probably would have signed on to this theoretical perspective. After all, his famous “love that dare not speak its name” wasn’t homosexuality, per se, but rather a “great affection of an elder for a younger man”: 

...as there was between David and Jonathan, such as Plato made the very basis of his philosophy, and such as you find in the sonnets of Michelangelo and Shakespeare. It is that deep, spiritual affection that is as pure as it is perfect. It dictates and pervades great works of art like those of Shakespeare and Michelangelo… It is beautiful, it is fine, it is the noblest form of affection. There is nothing unnatural about it. It is intellectual, and it repeatedly exists between an elder and a younger man, when the elder man has intellect, and the younger man has all the joy, hope and glamour of life before him. That it should be so, the world does not understand. The world mocks at it and sometimes puts one in the pillory for it.

But, generally speaking, Muscarella’s theory doesn’t seem to pull a lot of weight. Not many teenage boys in any culture seem terribly interested in taking this particular route to success. Rather—and I may be wrong about this—I think most teenage boys would prefer to scrub toilets for the rest of their lives or sell soft bagels at the mall than become the sexual plaything of an “older gentleman.”

In any event, given the biological (even adaptive) verities of being attracted to adolescents, most experts in this area find it completely illogical for Blanchard to recommend adding hebephilia to the revised DSM-V. (Especially since other more clearly maladaptive paraphilias—such as gerontophilia, in which men are attracted primarily to elderly, post-menopausal women—are not presently included in the diagnostic manual.) The push to pathologize hebephilia, argues forensic psychologist Karen Franklin, appears to be motivated more by “a booming cottage industry” in forensic psychology, not coincidentally linked with a “punitive era of moral panic." Because “civil incapacitation” (basically, the government’s ability to strip a person of his or her civil rights in the interests of public safety) requires that the person be suffering from a diagnosable mental disorder or abnormality, Franklin calls Blanchard’s proposal “a textbook example of subjective values masquerading as science.” Another critic, forensic psychologist Gregory DeClue, suggests that such medical classifications are being based on arbitrary distinctions dictated by cultural standards: 

Pedophilia is a mental disorder. Homosexuality is not. Should hebephilia or ephebophilia or gerontophilia be considered mental disorders? How about sexual preference for people with different (or with the same) ethnic characteristics as oneself?

And Marquette University psychologist Thomas Zander points out that since chronological age doesn’t always perfectly match physical age, including these subtle shades of erotic age preferences would be problematic from a diagnostic perspective:

Imagine how much more impractical it would be to require forensic evaluators to determine the existence of pedophilia based on the stage of adolescence of the examinee’s victim. Such determinations could literally devolve into a splitting of pubic hairs.

One unexplored question, and one inseparable from the case of Michael Jackson, is whether we tend to be more forgiving of a person’s sexual peccadilloes when that individual has some invaluable or culturally irreplaceable abilities. For example, consider the following true story:

There once was a man who fancied young boys. Being that laws were more lax in other nations, this man decided to travel to a foreign country, leaving his wife and young daughter behind, where he met up with another Westerner who shared in his predilections for pederasty, and there the two of them spent their happy vacation scouring the seedy underground of this country searching for pimps and renting out boys for sex.

Now if you’re like most people, you’re probably experiencing a shiver of disgust and a spark of rage. You likely feel these men should have their testicles drawn and quartered by wild mares, be thrown to a burly group of rapists, castrated with garden shears or, if you’re the pragmatic sort, treated as any other sick animal in the herd would be treated, with a humane bullet to the temple or perhaps a swift and sure current of potassium chloride injected into the arm.

But notice the subtle change in your perceptions when I tell you that these events are from the autobiography of André Gide, who in 1947—long after he’d publicized these very details—won the Nobel prize in literature. Gide is in fact bowdlerizing his time in Algiers with none other than Oscar Wilde. 

Wilde took a key out of his pocket and showed me into a tiny apartment of two rooms… The youths followed him, each of them wrapped in a burnous that hid his face. Then the guide left us and Wilde sent me into the further room with little Mohammed and shut himself up in the other with the [other boy]. Every time since then that I have sought after pleasure, it is the memory of that night I have pursued.

It’s not that we think it’s perfectly fine for Gide and Wilde to have sex with minors or even that they shouldn’t have been punished for such behaviors. (In fact Wilde was sentenced in London to two years hard labor for related offenses not long after this Maghreb excursion with Gide and died in penniless ignominy.) But somehow, as with our commingled feelings for Michael Jackson, “the greatest entertainer of all time,” the fact that these men were national treasures somehow dilutes our moralistic anger, as though we’re more willing to suffer their vices given the remarkable literary gifts they bestowed.

Would you really have wanted Oscar Wilde euthanized as though he were a sick animal? Should André Gide, whom the New York Times hailed in their obituary as a man “judged the greatest French writer of this century by the literary cognoscenti,” have been deprived of his pen, torn to pieces by illiterate thugs? It’s complicated. And although in principle we know that all men are equal in the eyes of the law, just as we did for Michael Jackson during his child molestation trials, I have a hunch that many people tend to feel (and uncomfortably so) a little sympathy for the Devil under such circumstances.

In this column presented by Scientific American Mind magazine, research psychologist Jesse Bering of Queen's University Belfast ponders some of the more obscure aspects of everyday human behavior. Ever wonder why yawning is contagious, why we point with our index fingers instead of our thumbs or whether being breastfed as an infant influences your sexual preferences as an adult? Get a closer look at the latest data as “Bering in Mind” tackles these and other quirky questions about human nature. Sign up for the RSS feed or friend Dr. Bering on Facebook and never miss an installment again.

 

 

Correction (posted 7/2/09): When this story was originally posted, we incorrectly stated that the DSM-IV is published by the American Psychological Association, rather than the American Psychiatric Association. Scientific American regrets the error.


Boeing 747 Missed Approach

Boing-Boing

August 28, 2011

The Joy of Self-Defecation

  • People today pay little attention to the important things in life. This is because the important things often have little to do with Ego. Like communication. I walked by a sign recently that said "Danger due to: Falling Derbis." Someone had carelessly misspelled the word, thinking only of themselves and not of others. What if I thought there were falling Derbis? What are those? I would stand there furrowing my brow, trying to figure out what the heck Derbis is and get smacked on the head by some careless worker's wayward hammer or lunchbox.

    People today need to discover that there are others around. They need to discover the joy of self-defecation. When you feel the pressures of egoism and individualism building up inside you, especially after a big meal, you need to defecate yourself to bring your ego back into its place.

    People who take all of the credit are bloated with selfishness and do not understand the feeling of release and lightness that can come with self-defecation.

    Try defecating yourself. Think about communication. Think about others. It will feel great and careless errors will cease, making the world a safer place.

  • I mean self-deprecation
  • You are so funny, but very very wise
  • i have to admit i liked defecation much better

shitting himself or sincere? hard to know for sure


Tom R. Toe - YouTube

Gogotanzers + Traks Get Ready (Best to worst)