As if you needed another reason to be wary of chat rooms geared toward meeting people and striking up flirtatious cyber-relations, doing so can now put you at increased risk of identity theft. CyberLover.ru, a new site out of Russia, boasts that buyers of its software will be able to trick unsuspecting marks into handing over their personal information. CyberLover.ru's sexy bot can allegedly drum up salacious conversations using 10 different personalities that are so lifelike that victims will hand over their photos, phone numbers, and more at the drop of a negligee. The program can also be tailored toward either gender and used to obtain other forms of data, says the company.
Security software company PC Tools warns that the bot can easily be used for malicious purposes. The company said that the program's ability to mimic human behavior to dupe chatters is worrisome, and could readily be used to collect all manner of information. "As a tool that can be used by hackers to conduct identity fraud, CyberLover demonstrates an unprecedented level of social engineering," said PC Tools senior malware analyst Sergei Shevchenko in a statement. "CyberLover has been designed as a bot [robot] that lures victims automatically, without human intervention. If it's spawned in multiple instances on multiple servers, the number of potential victims could be very substantial."
The bot is able to simulate a number of different personalities, ranging from "romantic lover" to—this is not a joke—"sexual predator" (Mmm, I know that one really gets me going on those cold, lonely Friday nights). Once it collects the personal data of whoever it is chatting with, the bot then stores it all and sends it to its owner. PC Tools warns that the bot also lures lonely users to visit a website or blog, "which could in fact be a fake page used to automatically infect visitors with malware."
The creators of CyberLover, however, deny that the bot is intended for anything other than providing lonely chatters with a few thrills. "The program can find no more information than the user is prepared to provide," an employee identified as Alexander told Reuters. "If you have someone who is ready to hand over secret information to the person they are chatting to after having known them for all of five minutes, then in that case a leak of information is possible."
Alexander may have a point, but that doesn't make the data collection any less disconcerting. Regular chatroom-goers should protect themselves by using aliases online and avoiding giving out too much personal information to strangers—no matter how dirty they talk to you.
Google has surely noticed that much of its search traffic is directed to Wikipedia, which regularly has an entry in the top five search results for any particular term. If Google could steer all that traffic toward its own properties instead, and if those properties contained Google ads, and if Google split its revenue with the article creators… well, it's not hard to see why this would start to look pretty good to both Google and content creators, and why such an initiative could ramp up quickly.
Udi Manber, Google's VP of Engineering, announced just such a plan last night, a program that (in his words) will make it easier for those with knowledge to share it with the world. The system is called "Knol"—which refers to a "knowledge unit"—and it will let anyone create, edit, and profit from a page packed with information on a specific topic. In other words, Google doesn't just want to link to Wikipedia, it wants to be Wikipedia.
For a company that got its start by bowing at the Altar of the Algorithm, bringing human-created content in-house is the most recent manifestation of a paradigm shift that has been in the works for the last few years now, one that hasn't been happening without controversy. With the announcement of Knol, Google is already inviting questions about whether its reach has now extended too far.
Land of the knols
The basic point behind the knol system is to highlight (and provide incentives for) authors—a direct shot at the anonymity of Wikipedia and other Web 2.0 systems that don't allow experts to stress their own credentials when posting.
Each knol (it's the name of both the pages and the service) is just a web page hosted by Google. It has a special layout, one generated by Google-supplied tools, that includes content, links, and an author biography.
A sample knol page
Each knol is controlled by the author who creates it. While strong community tools for suggesting changes, making comments, and ranking knols will exist, it's up to each knol's author to control the contents of the page.
Google will host the content but will not attempt to edit or verify it, instead trusting that the best knols will naturally rise to the top (a single topic can have multiple knols, each competing for higher placement in Google's search results).
Essentially, Google is offering to let people rebuild Wikipedia, and it seems to be targeting two classes of users: 1) experts who may not all feel welcome in Wikipedia, where their actions carry no special weight, and 2) those who aren't keen on spending their free time contributing to Wikipedia without compensation. While Wikipedia itself is diverse enough to survive, smaller projects like Citizendium could find the going much tougher.
You say you want a revolution? Well…
The Knol project is, in one sense, as nonrevolutionary as they come. Making information pages simple to develop? Ranking those pages? Monetizing those pages? Google itself does all three things already on the web through tools like Blogger, Google Search, and AdSense. Essentially, Google is just rolling out a new set of web page creation tools with a single template to work on.
Google's professed interest in making it easy for people to put information on this thing called "the Internet" might have rung true in 1998, but that simply can't be the reason for Knol in 2007. It's already too easy. Wikipedia makes it simple. So do blogging tools.
Instead, Google wants to mount a direct challenge to various social knowledge sites. Although it won't have an exclusive license to the content created for Knol, and though it will allow Knol pages to be indexed by all search engines, it's clear that Google really wants to be in control of a vast, Wikipedia/Citizendium knowledge store. And it can offer something that Wikipedia, et al., cannot: cash.
AdSense and its discontents
The revenue sharing bit is one of the keys to the whole project. Google is going to let authors choose if they want to include Google ads on their knols. The truly altruistic might say no. Most people will say yes.
And that's where things could get ugly. The lure of filthy lucre is likely to force several changes on the community model of current social knowledge projects. For one, it will break the community-oriented, we're-all-working-on-this-together spirit of sites like Wikipedia. With Knol, we're not in this together; we're in competition. Writing a knol on a popular topic could become a cash cow, as Google promises to split ad revenue with the author.
Many different authors can take a shot at creating a knol on the same topic, which should allow the best pages to claw their way to the top in a sort of survival of the fittest. But the thing about intellectual Darwinism is that it can be vicious, and we expect the same to be true of competition for the top knol spots.
Will Google be the one to police the inevitable claims of plagiarism? Will it do anything when a knol rips off pictures from another knol? What happens when Wikipedia gets ripped off or rewritten? Google is famously loath to intervene manually, but when the company is creating an ecosystem that rewards individuals and puts so much cash on the table, problems are sure to result.
Maybe Google can be evil
The blogosphere reaction has already been electric. Even those likely to give Google the benefit of the doubt when it comes to not being evil are having second thoughts. What possible reason does the company have for moving beyond indexing and into the hosting and control of this sort of content?
Actually, Google has been making these moves for years. Google Book Search, Google Video, and YouTube are only the highest-profile examples of the way that Google has moved far beyond its roots in pointing people to other places on the 'Net.
Social knowledge, as exemplified by the high search placement of Wikipedia articles and the growth of sites like Mahalo, has been high-profile for long enough to earn a spot on the Google strategic radar screen. Despite the idealistic sentiments about ease of knowledge production, Knol looks more like an attempt to kneecap various sites that now command a good chunk of Google's outgoing search result links.
With Google having a vested interest in knols, but also being the main search engine that will index and rank those links, many people already suspect a conflict of interest. While we suspect Google will be careful not to give a special boost to knol results (at the risk of ruining user confidence in its results), others aren't so sure. At the very least, it will create suspicion.
Om Malik argues that this is just "Google using its page rank system to its own benefit. Think of it this way: Google's mysterious Page Rank system is what Internet Explorer was to Microsoft in the late 1990s: a way to control the destiny of others."
TechCrunch wonders if this is "a step too far." Knol "brings the power of Google into a marketplace that is already rich with competition," writes Duncan Riley, "and a marketplace where Google can use its might to crush that competition by favoring pages from Knol over others, on what is the world's most popular search engine."
And Danny Sullivan of Search Engine Land says, "It begins to feel like the knowledge aggregators are going to push out anyone publishing knowledge outside such aggregation systems."
This can't be the reaction that Google was hoping for with its announcement, but it may not matter. The naysayers can do their naysaying, but we suspect that the prospect of cash, combined with the competition for top spots in the Knol hierarchy, will lead to plenty of quality content at a rapid clip. Whether that's a positive development for the web is another question.
Pulse~LINK, one of the many entrants in the wireless HD technology race, has announced a new, ultra-wideband-based chipset that it claims can outdo the competition. According to an independent performance comparison (PDF) conducted by the EE Times and released by Pulse~LINK, the company's UWB implementation, called CWave, delivers sustained close-range performance that's more than 20x higher than its next-closest competitor. Specifically, Pulse~LINK promises between 480Mbps and 890Mbps, depending on transmission range.
Pulse~LINK offers a partial explanation of this performance gap between its own CWave product and its WiMedia-based competitors, claiming that while other manufacturers chose to use the PC-centric wireless USB (W-USB) protocol and moved away from the goal of offering HD content over a wireless connection, Pulse~LINK stayed focused exclusively on wireless HD. If successful, the CWave chipset could find a home in consumer electronics, enabling an HD DVD or Blu-ray player to wirelessly transmit an HD-resolution signal to a television or display elsewhere in the home, even in another room.
According to the performance comparison, the CWave is capable of delivering up to 890Mbps of throughput at very close range, dropping to around 480Mbps at a distance of eight feet. That's still much faster than the study's reference wired USB transfer rate of ~160Mbps, but the CWave's speed continued to drop rapidly as range increased until the two devices were approximately 12 feet apart. At that point, Pulse~LINK's chipset maintained a stable transfer rate just below 120Mbps all the way out to 35 feet. Performance began to dip at the 40-foot mark, but the study's authors say they were out of room at that point and were unable to test greater ranges. By comparison, the top-rated devices from CWave's competitors topped out at 50Mbps and saw even that transfer rate fall as devices approached the 30-foot mark.
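To put those rates in perspective, here's a quick back-of-the-envelope calculation. The rates are taken from the comparison above; the 25GB disc size is our own nominal figure for a single-layer Blu-ray image, used purely for illustration:

```python
# Approximate sustained rates from the EE Times comparison (megabits/s)
rates_mbps = {
    "CWave, very close range": 890,
    "CWave, 8 feet": 480,
    "CWave, 12-35 feet": 120,
    "Wired USB reference": 160,
    "Top competing UWB device": 50,
}

# Nominal single-layer Blu-ray image size, assumed for illustration
disc_bytes = 25 * 10**9

def transfer_minutes(rate_mbps: float, size_bytes: int) -> float:
    """Minutes needed to move size_bytes at a sustained rate in Mbps."""
    bits = size_bytes * 8
    seconds = bits / (rate_mbps * 10**6)
    return seconds / 60

for label, rate in rates_mbps.items():
    print(f"{label}: {transfer_minutes(rate, disc_bytes):.1f} min")
```

Even the 120Mbps rate that CWave holds out to 35 feet is comfortably above the roughly 20-40Mbps that a compressed 1080p stream requires, which is the point of the wireless-HD pitch: streaming needs sustained headroom at range, not just a headline burst rate.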
Pulse~LINK has yet to state when we might see shipping products based on CWave, how much they might cost, or how difficult it would be for a standard home user to deploy the company's UWB technology across a home. A standard that can also interface over existing wiring sidesteps some of the range concerns surrounding UWB implementations, but devices will have to be priced comparably to 802.11n products in order to catch the eye of the mass market.
The number of companies interested in HD-over-wireless is significantly larger than Pulse~LINK implies, and the study the company quotes doesn't actually compare CWave against any products from these other companies. TZero and Analog Devices partnered to launch a 480Mbps-capable wireless HDMI system in September 2006, Samsung is developing a range of 50" and 58" wireless HDTVs that broadcast at 1080p using 802.11n, and a number of industry players including LG, Sony, Samsung, and Toshiba formed the WirelessHD Consortium back in October 2006 with the goal of developing and marketing a wireless HD solution. Pulse~LINK's CWave may actually be a superior solution, but the field is considerably more crowded than the company's comparison study indicates.
The release candidate of Vista SP1 was released to the general public just a few days ago, and many fans are still in the process of downloading and installing it on their systems. As expected, there are plenty of bug fixes and general improvements bundled into Vista SP1, but several interesting features have been lost in the shuffle. One such feature is the enabling of support for what Microsoft terms "hotpatching".
Hotpatching is a process in which Windows components are updated while still in use by a running process. As you can imagine, this handy feature eliminates the need to reboot, maximizing system uptime and minimizing user headaches. According to Microsoft, hotpatch-enabled update packages are installed in a similar manner to standard update packages. The company's description of the feature seems to suggest that this ability is inherent to Vista, but was previously disabled or incomplete.
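Microsoft hasn't published the internals of how hotpatching works, but the core idea, swapping new code into a process that keeps running, can be illustrated loosely with Python's module reloading. This is an analogy only, and every name in the sketch is invented for illustration:

```python
import importlib
import os
import sys
import tempfile

# Skip .pyc caching so a reload always recompiles the source on disk.
sys.dont_write_bytecode = True

# Create a throwaway "component" module in a temp directory.
workdir = tempfile.mkdtemp()
sys.path.insert(0, workdir)
module_path = os.path.join(workdir, "component.py")

with open(module_path, "w") as f:
    f.write("VERSION = 1\n")

importlib.invalidate_caches()
import component  # the "running process" is now using version 1

# "Patch" the component on disk while it is still loaded...
with open(module_path, "w") as f:
    f.write("VERSION = 2\n")

# ...then swap the new code into the live process, no restart required.
importlib.reload(component)
print(component.VERSION)  # 2
```

The Windows mechanism operates at a very different level (kernel and system DLLs rather than interpreted modules), but the payoff is the same: the updated code takes effect without tearing down the running system.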
Hotpatching is something that system administrators will love when faced with a slate of PCs in need of reformatting and restoration to a usable state. Not everyone has the capability or the inclination to create a custom slipstreamed image, and we know that the number of reboots required to get a basic Windows system properly patched up with some productivity software can be frustrating at times.
There are a host of other improvements with SP1, all of which should also help make your Windows Update experience more rock-solid than it already is:
- Improved patch deployment by retrying failed updates in cases where multiple updates are pending and the failure of one update causes other updates to fail as well.
- Optimized OS installers so that they are run only when required during patch installation; having fewer installers operating results in a more robust and reliable installation.
- Improved robustness during patch installation by being resilient to transient errors such as sharing violations or access violations.
- Improved robustness against transient failures during the disk cleanup of old OS files after install.
- Improved overall install time for updates by optimizing the query for installed OS updates.
- Improved the uninstallation experience for OS updates by improving the uninstallation routines in custom OS installation code.
- Improved reliability of OS updates by making them more resilient to unexpected interruptions, such as power failure.
TechNet has a handy list of the notable changes in the SP1 release candidate if you're looking for more dirt on what Vista SP1 will bring to the table.
To understand this post, you're going to need to know a little bit about gene regulation. The location on the DNA where a gene's messenger RNA starts is called a promoter—it contains binding sites for proteins that help start the RNA-making process. For many genes, that's all that's needed. But for those with complex regulatory control—and that includes most of the genes involved in embryonic development—other sequences are needed to ensure that a promoter is only active in the right tissues at the right time. These sequences, called enhancers, can be more or less anywhere, even hundreds of kilobases distant from the promoter they regulate.
This raises a couple of obvious questions: how do they ever find the promoter? If they can work at such large distances, why don't the enhancers just activate all the genes nearby? To give you a sense of the scale of the problem, I'll draw an analogy based on the Bithorax gene complex that's the subject of a paper from the most recent issue of Development. Two genes and an enhancer (among other things) reside in a region that's the DNA equivalent of over 100 miles long. The promoters of the genes are only about 100 feet long. Somehow, the enhancer not only finds a promoter to regulate, but it finds the right one.
The new paper helps describe how this happens. It turns out that the promoter has a sequence just next to it that helps specifically attract the enhancer to that promoter—the researchers called it a tether. Delete the tether, and the enhancer will regulate whichever gene is closest. Add the tether to an unrelated gene, and the enhancer will regulate that. You can even stick other genes or DNA insulating sequences between the enhancer and its tether, and the enhancer will ignore them all and work with the promoter next to the tether. To draw another analogy, the tether acts like a postal code to help the enhancer find the right neighborhood.
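The logic of those experiments can be captured in a toy model. This is pure illustration: the gene names and positions are made up, and real enhancer-promoter targeting involves far more biochemistry than a two-rule function. The point is just that a tether overrides proximity:

```python
# Toy model of tether-directed enhancer targeting (illustrative only).
# Each gene is a tuple: (name, promoter_position, has_tether).

def enhancer_target(genes, enhancer_pos):
    """Pick the gene the enhancer regulates under the tether model:
    any tethered promoter wins outright; with no tether present,
    fall back to simple proximity."""
    tethered = [g for g in genes if g[2]]
    if tethered:
        return tethered[0][0]
    # No tether anywhere: default to whichever promoter is closest.
    return min(genes, key=lambda g: abs(g[1] - enhancer_pos))[0]

# Wild type: the distal gene carries the tether, so distance is ignored.
genes = [("nearby_gene", 10, False), ("distal_gene", 300, True)]
print(enhancer_target(genes, enhancer_pos=0))  # distal_gene

# Delete the tether: the enhancer now regulates whichever gene is closest.
genes = [("nearby_gene", 10, False), ("distal_gene", 300, False)]
print(enhancer_target(genes, enhancer_pos=0))  # nearby_gene
```

The two calls mirror the paper's key manipulations: the wild-type arrangement sends the enhancer past the nearby gene to the tethered one, while deleting the tether reduces targeting to mere proximity.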
The authors cite an earlier paper that found something similar in a completely different complex of genes and regulatory elements, suggesting that tethering represents a general mechanism for gene regulation. With dozens of fly genomes now available, it's possible that a few more examples of this will be enough to let bioinformatics gurus fish out tethering sequences, so that the biochemists can tell us how they work.
Development, 2007. DOI: 10.1242/dev.010744
Anyone looking forward to Toshiba's 30" OLED (Organic Light Emitting Diode) displays in 2009 or 2010 is either going to have to wait a little longer or find a different manufacturer. Toshiba announced this week that it was shelving plans to build large OLED displays because the current and short-term costs of mass production are too high to create a commercially viable product. It's surprising that a company of Toshiba's size would back away from OLED-based television manufacturing. OLED displays, after all, have been touted as the Next Big Thing™ as far back as 2001.
OLEDs have a number of advantages over current LCD technology. Unlike LCDs, they do not require a backlight to function, which allows them to draw much less power while active. OLEDs are potentially more efficient to manufacture than LCDs, can be printed on flexible substrates, have a much better viewing angle than modern LCDs, and display better and more realistic color. Furthermore, OLED displays are also faster than their LCD counterparts. All of these improvements, however, come with a cost.
To date, OLED production has been considerably slower than what was initially forecast. Manufacturers have wrestled with display lifetime for years. Originally, OLED displays had a lifespan of only 5,000 hours compared to an LCD's lifetime of 60,000 hours. This has slowly changed over time—manufacturers now estimate they can build OLED screens that equal or exceed the lifetime of a standard LCD—but these sorts of issues have pushed mass-market introduction of large OLED displays ever further into the future.
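Those lifetime figures are easier to appreciate as years of use. A quick calculation, assuming (our assumption, for illustration) eight hours of viewing per day:

```python
HOURS_PER_DAY = 8  # assumed daily viewing time, for illustration only

def lifetime_years(rated_hours: int, hours_per_day: int = HOURS_PER_DAY) -> float:
    """Convert a panel's rated lifetime in hours into years of daily use."""
    return rated_hours / (hours_per_day * 365)

print(f"Early OLED (5,000 h): {lifetime_years(5_000):.1f} years")
print(f"Typical LCD (60,000 h): {lifetime_years(60_000):.1f} years")
```

At that usage rate, an early 5,000-hour OLED panel would have worn out in under two years, while a 60,000-hour LCD lasts roughly two decades, which is why lifetime was the make-or-break issue for OLED televisions.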
OLEDs have had some success in very small displays (think cell phones, MP3 players, etc.), but no sizable screens had been produced until this year. As the EE Times reports, Sony has launched an 11" OLED display in Japan—but production costs have forced the company to build just 2,000 of these displays per month. At around $1,700 for an 11" screen, this isn't exactly the kind of display anyone would pick up for movie watching, either.
Based on the current speed of OLED display development, Toshiba's decision to put 30" screens on hold appears to be a sound one. Even if 30" OLED displays become available in the next two years, they're likely to carry a hefty price tag and face a slow market ramp-up. It's reasonable to expect both LCD and plasma displays to improve (and costs to decline) between now and 2010, which will put OLED displays at a further market disadvantage. That's not to say that OLED panels won't be able to compete down the line, but that looks to be the proverbial Three to Five Years Away.
Companies like Sony and Samsung have reaffirmed their intent to push forward with OLED screen production despite the cost; it wouldn't be surprising if Toshiba is scaling back its own plans in this area to cut expenses while staying current enough to take advantage of eventual price drops.
The battle in the Senate over how to amend the Foreign Intelligence Surveillance Act (FISA) begins on Monday when, over the objections of prominent Democrats, Majority Leader Harry Reid will introduce the White House–supported version of a reform bill approved in October by the Senate Select Committee on Intelligence.
This past Tuesday, fourteen Democratic senators—including presidential contenders Hillary Clinton, Barack Obama, Joseph Biden, and Chris Dodd—signed a letter urging Reid to instead bring to the floor an alternative bill produced in November by the Judiciary Committee. That version of the legislation, which will now be offered as an amendment to the Intelligence Committee's bill, contained a variety of additional restrictions and checks on government wiretaps sought by civil liberties groups. It also, crucially, omitted a provision granting telecom companies retroactive immunity from lawsuits related to their cooperation with the president's extrajudicial eavesdropping program. President Bush has pledged to veto any FISA amendment that fails to provide such immunity—a threat that did not deter the House from passing just such a bill last month. Meanwhile, Senator Dodd, whose attempt to place a "hold" on the Intelligence Committee bill was overridden by Reid, is pledging to filibuster any legislation that does include retroactive immunity.
While Democrats have struggled to counteract a frustrated base's perception of congressional capitulation to the White House, the executive branch has mounted a full court press in favor of its preferred version of the law. In a Los Angeles Times op-ed on Wednesday, Attorney General Michael Mukasey warned that the changes made in the Judiciary Committee's version of the bill "would have the collective effect of weakening the government's ability to effectively surveil intelligence targets abroad." And on Thursday, Mukasey and Director of National Intelligence Mike McConnell made their case directly to the Senate in a closed-door briefing.
With divisions sharp, various attempts to split the difference between the alternatives have fallen flat. Reid had earlier sought, under the Senate's Rule Fourteen, to offer a pair of his own bills mixing and matching provisions from the two committees, a solution that appears to have pleased nobody. And on Thursday, the Judiciary Committee rejected a proposal by Senator Arlen Specter to allow lawsuits against the telecom companies to go forward, but with the government substituted as the defendant. (The groups bringing the suits worry that the government would be able to invoke legal defenses, such as executive privilege and sovereign immunity, that are unavailable to private telecom providers.) Michelle Richardson, a legislative consultant for the American Civil Liberties Union, hopes that this may be a strategic blunder on the administration's part. "A lot of people would probably support giving the government broader authority if they would decouple that issue from the immunity question," says Richardson, "so they're probably shooting themselves in the foot by forcing it to go forward like this."
The current wrangling continues a debate that began this summer with the hasty passage of the Protect America Act in response to a ruling by the FISA court—a ruling which the court has declined to release, but which is purported to have required intelligence agencies to acquire warrants when wiretapping conversations between foreign parties that were routed (and recorded) through US telecom switches. Eavesdropping on purely foreign communications had previously been unrestricted—primarily because, traditionally, the physical tap on foreign-to-foreign calls had occurred overseas, outside US jurisdiction. But the Protect America Act, which is due to expire in February, went beyond merely closing this "intelligence gap" and authorized a broad program of surveillance, under minimal court oversight, that permits Americans' conversations with foreigners to be collected, so long as the American party to the communication was not "targeted" by an investigation. The bills now under consideration seek to establish a more permanent solution: the Intelligence Committee version of the FISA Amendment would remain in effect for six years, while the Judiciary Committee version sunsets in four.
While media attention has focused largely on the question of immunity for telecom firms, the additional limitations on surveillance contained in the Judiciary Committee's version of the bill are, arguably, at least as significant. That bill would explicitly bar "bulk" or "vacuum cleaner" surveillance of international telecom traffic that is not directed at a particular person or telephone number. It would require individualized FISA court review whenever the collection of an American's communications became a "significant purpose" of an investigation, whether or not that person was a "target" of the investigation. And it would provide for a congressional audit of past extrajudicial surveillance by the National Security Agency.
A spokesman for Reid says the majority leader hopes to be able to send a bill to conference before Congress adjourns for winter recess, though some observers find this unlikely, and civil liberties groups are anxious to avoid a repetition of the sort of last-minute legislation that produced the Protect America Act. Meanwhile, some civil libertarians are already casting an eye toward the next battle. "We're going to keep fighting to get the important judicial protections in, the immunity out, but if we can't do those things we're going to get as many no votes on the final product as possible," says the ACLU's Richardson. "We don't want the members owning this bill, owning this program, so that when it finally does sunset we can get meaningful changes."
The difficulty and frustration of building GNOME from source is a major impediment for many new contributors. Installing the dependencies, getting the tools working, and compiling major components of the desktop environment is a burden that detracts from time that could be spent making patches. In order to resolve this problem, the developers from rPath have created the GNOME Developer Kit, a complete environment for testing and developing GNOME.
The GNOME Developer Kit is based on the Foresight Linux distribution and includes regularly updated packages built from the latest code in GNOME’s version control system. The Developer Kit is made available as a VMware image as well as an installable ISO. The included package management system can be used to keep the system up to date as changes are made to GNOME during the development cycle.
“Once you download it, you will easily be able to update it every day with PackageKit or Conary, so no need to download new versions,” said rPath’s Ken VanDine in a blog entry. “There will be new downloads available regularly, probably daily, so when you download it, it will be ready to go immediately, without waiting for additional updates. The best of both worlds!”
In addition to providing a complete GNOME desktop environment, the GNOME Developer Kit also includes some experimental components that are under active development or being considered for inclusion in GNOME. For instance, Empathy, PulseAudio, PolicyKit, and PackageKit are all included by default. Although I like a lot of the nice things that are included in the GNOME Developer Kit, there are still a few other extras that I’d like to see added. In particular, I think it would be really nice if it included Mono and MonoDevelop from the latest sources.
Happy Friday, er, again! For those of you participating in Consumermas, remember: you only have a week and a few days left to shop. But check out these links while you continue to put off buying your mom some new Tupperware:
- Walt Disney World's "Spaceship Earth" ride in Epcot is a fond memory of many of our childhoods. Epcot recently added a tribute to Apple's beginnings to the ride with Steve, but now we're not so sure which Steve it is. Originally thought to be Steve Jobs, Gizmodo calls that theory into question, saying that perhaps the Steve represented in plastic form is Woz instead.
- Apple has applied for a patent on a way to detect free-fall in electronics, which can be particularly helpful when protecting data stored on devices with moving parts (such as hard drives). This could eventually lead to better accident protection on devices like iPods (classic, of course) and laptops.
- If any of you were still stuck using Virtual PC, which is now owned by Microsoft, then you should give up any hope of using it in Leopard. The Mac BU confirmed to The Mac Observer this week that Virtual PC would not be coming to Apple's new OS—the last remaining version is Tiger-only.
- Speaking of Microsoft, Office 2008 is done and ready to go. For realz this time. The final build has been released to manufacturing (RTM) this week, putting it right on track to launch on January 15 at Macworld.
- Apple was named in a defamation suit against cable TV network BET this week. BET had aired a photo of Chicago gang leader Larry Hoover along with Houston residents James Prince and Thomas Randle while implying that the three were murderers. The piece eventually made its way to the iTunes Store, and the rest is history: a lawsuit followed.
- During last week's opening of the 14th Street Apple Store in New York, senior VP of retail Ron Johnson said that the company planned to open 40 more stores in 2008.
Now off to your endless string of holiday parties. Have some extra-spiked egg nog for us.
The prospect of having broadband while flying at 35,000 feet is enough to get most of us geeks a-twitter. We'd want it to be safe, of course, but frankly we're a little tired of the arguments against unencumbered in-flight WiFi that center on so-called 'Net etiquette. An AP report today recounts the objections of some travelers and airlines alike to in-flight WiFi, and the report trots out the usual ghosts and goblins: armies of loud VoIP users, people "flaming" each other on planes (oh please), and guys who just can't stop looking at porn. These Chicken Little scenarios are tiresome and not supported by the evidence, yet they are being used by many airlines to justify blocking and filtering in-flight broadband. Some airlines will filter VoIP packets, others will use blacklists to block access to sites. (And meanwhile, people are still bringing "carry-ons" onto the plane that are twice the allowed size.)
We only need to look at current parallels to see that those fears have little basis in reality. Consider the porn fear. Laptops on airplanes aren't exactly a new phenomenon—people have been taking them out to catch up on work, watch movies, play games, and otherwise screw around regularly for over a decade now. How often do we see someone on a plane pulling up and browsing his porn collection? Not very often (I, personally, have never seen such a thing). This nonexistent problem isn't going to be exacerbated by the presence of an Internet connection.
Another major concern appears to revolve around passengers making regular—and loud—phone calls through VoIP software like Skype. Although many airlines have offered those seat-back, in-flight phones for some time now, few people actually use them due to the high cost associated with them. Or do those passengers also avoid them because, well, no one wants to make a phone call while on a plane? It could be a mixture of both—I know that I'm not all that tempted to make phone calls while in-flight, except maybe to tell someone that I'll be late. Still, for those who do want to make calls, flights are loud and people are already chatting with each other. A few more people chatting into a headset instead of to the child next to them isn't going to add much noise that can't be blocked out by a decent set of headphones.
Nevertheless, some airlines are planning on blocking users from receiving calls in-flight, while others want to block VoIP entirely. Strangely, these fears didn't stop the airlines from trying to make money off of their proprietary phone services.
The AP brings up a few other concerns—what if the person in front of you wants to recline! Or, *gasp* what if the person next to you keeps peering over to see what you're doing? Again, these are problems that people already have while using laptops on airplanes. Sure, it's annoying, but certainly not worth banning or restricting in-flight broadband.
We could go on and on, but just consider a parallel example that has essentially changed our casual laptop travels overnight: the coffee house. Freely available WiFi, combined with still-falling laptop prices, has caused the presence of laptop users in coffee houses to explode over the last five years. At the right time of day, Panera and Starbucks look more like offices than restaurants or cafes.
The coffee house laptop crowd consists of all types, too. Gamers, business people, students… all people who would rather just do their thing than disrupt everyone with blaring porn or a loud Skype session. In fact, the presence of laptop users has brought an eerie silence to coffee houses—the last time I was at a Starbucks (armed with my MacBook and some headphones, of course), it struck me how oddly quiet everything really was, aside from the cheesy store music and the whir of the baristas doing their thing. People are generally very sensitive to those around them and tend to avoid viewing anything that might offend someone walking by, too. Something tells me that in-flight broadband will yield largely the same results.
Thankfully, some seem to realize the same thing. "We think decency and good sense and normal behavior" will prevail when it comes to in-flight 'Net etiquette, Aircell CEO Jack Blumenstein told the AP. "I'd rather have the responsibility in the hands of passengers and require them to be accountable for what they do on laptops and airplanes," Harvard Law School professor John Palfrey added.
We tend to agree. Bring on the in-flight broadband and let the complainers take the bus.