NVIDIA's new nForce 700 series chipset hit the market today with several new features and improved Penryn support. The 780i is meant to go head-to-head with Intel's X38, and matches up well against that chipset on paper; both chipsets offer 32 lanes of PCIe 2.0 support, eight-channel HD audio, and a huge number of USB 2 ports (12 for the X38, 10 for the 780i). Although the 780i only officially supports DDR2-800, Tech Report's chipset review states that NVIDIA's memory controller is actually capable of running at DDR2-1200 speeds. As an added bonus, the 780i supports 3-way SLI configurations, though there may be a few caveats attached to performance scaling, as we'll discuss.
In terms of its actual design, however, the nForce 700 series looks like a bit of a kludge. The chipset's PCIe 2.0 support is delivered by the new nForce 200 bridge chip, which sits between the 780i SPP and the PCIe 2.0 slots. While the nForce 200 chip provides a full 32GBps of bandwidth to its two PCI-Express 2.0 x16 slots, it's connected to the 780i SPP via a single x16 link, for a total of 14.4GBps of bandwidth. According to NVIDIA, this bandwidth discrepancy is not an issue, thanks to the nForce 200's ability to handle GPU-to-GPU communication without transferring data back to the north bridge.
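As a rough sanity check on those figures, the 32GBps number falls out of simple per-lane arithmetic. This is a back-of-the-envelope sketch, not NVIDIA's math; the per-lane rate is the commonly quoted effective figure for PCIe 2.0, roughly 500MB/s per lane, per direction:

```java
// Back-of-the-envelope PCIe bandwidth arithmetic (illustrative only).
public class PcieBandwidth {
    public static void main(String[] args) {
        double perLaneGBps = 0.5; // PCIe 2.0: ~500MB/s per lane, per direction
        int lanesPerSlot = 16;
        int slots = 2;            // the two x16 slots behind the nForce 200
        int directions = 2;       // full duplex: count both directions

        double slotBandwidth = perLaneGBps * lanesPerSlot * slots * directions;
        System.out.println(slotBandwidth + " GBps"); // 32.0 GBps
    }
}
```

The 14.4GBps figure for the SPP link, by contrast, doesn't fall out of standard PCIe rates, which suggests NVIDIA runs that interconnect at a nonstandard clock; I haven't tried to reconstruct that number here.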
Drop in a third video card, and things get more complicated. The third PCIe slot on the 780i chipset only supports a standard PCI-Express interconnect, and hangs off the 780i MCP, rather than the nForce 200. This means that the third card in a 780i SLI rig is restricted to half the bandwidth of the other two cards. The third card will also take a latency hit compared to the first two, given that all communication must be passed across the MCP, SPP, and nForce 200 before returning via the same path. At this point, NVIDIA's tri-SLI looks less like a genuine feature, and more like a capability NVIDIA included to please the marketing department.
According to Tech Report's review, the 780i SLI SPP may actually be a 680i SLI SPP validated to maintain a higher interconnect speed, while the 780i SLI MCP is actually the same nForce 570 south bridge that NVIDIA's been relying on for the past 18 months or so. I recommend you read TR's coverage for more information, but from my perspective, the 780i SLI is a solution that gets the job done, but doesn't offer anything compelling against the X38 (with the obvious exception of SLI support). NVIDIA's Penryn compatibility issues were obviously enough reason for the company to push a chipset refresh out the door, but at this point I'm more interested in what the Next Big Thing from NVIDIA might be—the 780i seems to be an attempt to shoe-horn certain capabilities into an older platform, rather than a genuine attempt to launch something new.
When you drop your PC off at Circuit City for a hardware upgrade (and you do use Circuit City for all your hardware upgrades, don't you?), you probably don't expect the techs to rummage around your hard drive, dredging up "questionable" files and showing them to law enforcement. But that's exactly what happened to Kenneth Sodomsky in 2004 at a Pennsylvania Circuit City. Now, the Superior Court of Pennsylvania has concluded that the child porn found on Sodomsky's computer by Circuit City techs could in fact be used against him at trial.
The video in question, a clip of "the lower torso of an unclothed male," was discovered on Sodomsky's computer when he took it in for the installation of a new DVD burner. After installing the drive, Circuit City techs routinely install any included software and then test the whole arrangement to see if it works.
When doing said testing, the tech in question pulled up Windows XP's search dialog and scanned the whole hard drive for video files. As the list populated, the tech noted that "some of the files appeared to be pornographic in nature" due to their filenames, with some names indicating that they showed 13- or 14-year old boys. The tech clicked the first one, and immediately stopped the video when a hand appeared in the frame, moving toward the unclothed boy.
Cops were called, an arrest was made, Sodomsky was charged. His "life was over," he said. But in the court case, he argued that the techs had illegally violated his privacy by looking at the files and that the evidence could not be used against him. A court initially agreed, but a recent appeal to the state Superior Court has now overturned that decision.
The Superior Court opinion, a copy of which was seen by Ars (and first spotted by Cnet), decrees that Sodomsky didn't have an expectation of privacy in this particular case because he had granted the techs permission to install the drive and test it. Because they were only attempting to test the drive and ran only a general search for video files, what they found was permissible.
"We also find it critical to our analysis that when the child pornography was discovered," said the court, "the Circuit City employees were testing the DVD drive's operability in a commercially-accepted manner rather than conducting a search for illicit terms."
The case has been sent back to a lower court, which will now consider the evidence in the case against Sodomsky.
Unanswered by the opinion is the question of what, exactly, the Circuit City techs thought they were testing. According to the court, "the playing of videos already in the computer was a manner of ensuring that the burner was functioning properly." Note that the techs weren't burning a disc, nor reading from a disc; they were playing a video file from the hard drive. How this proves a DVD burner is installed correctly is unclear.
The opinion does provide some guidelines for thinking about search-and-seizure issues and privacy concerns, though. Not only did Sodomsky surrender at least a bit of his privacy when he asked the techs to work on his machine, but the police who arrived and viewed the video in question didn't violate any rules. Because Circuit City employees invited them into the repair area and showed them the clip, the police didn't violate the Fourth Amendment.
This weekend saw the conclusion of the United Nations Framework Convention on Climate Change in Bali, Indonesia, a day later than planned. The summit, designed to put in place a global plan to tackle climate change once the Kyoto Protocol expires in 2012, was highly fractious, and intransigence by a number of nations resulted in the threat of a failure to come to an agreement. At the last minute, however, a certain measure of consensus was achieved, and an action plan was approved by the member nations.
As we've reported before, the summit's aim was to take over from the Kyoto Protocol once it expires in 2012, in light of the reams of data contained within the Fourth Assessment Report (4AR), the recent four-part study concluded by the Intergovernmental Panel on Climate Change (IPCC). The 4AR has painted a much bleaker picture of the world's climate than before, with all signs pointing to a greater degree of disruption to the world's climate than previously thought. Some of these changes are now underway, but there is still a window for action, albeit short, to prevent some of the worst effects.
Running over its schedule by a day, the Bali conference agreed on a roadmap on Saturday that puts in place a two-year process to attempt to agree on widespread reductions in anthropogenic climate emissions. Two days earlier, the European Union was highly critical of the US' continued intransigence on the issue. The mood could be summed up by an impassioned plea from Papua New Guinea's representative, who told the US, "If you're not willing to lead, get out of the way." The plea was effective, as the US agreed to support the roadmap.
However, although an agreement was arrived at, one has to question its worth. The US opposition centered on the EU and China's proposal for a reduction in emissions to 25-40 percent below 1990 levels by the developed nations, and on the lack of any concrete demands on the developing world. As a result, the EU's targets have been dropped, replaced by a commitment to "deep cuts," and the US is already seen by many to be backtracking on the plan.
The failure to address the developing world's emissions in Kyoto has been used by politicians of all flavors in the US over the past 16 years to stall any meaningful action on the national level, and it's hard to envision the response to Bali being much different. New York Mayor Michael Bloomberg spoke to a fringe meeting and pointed to the problem: "…Congress. They're unwilling to face any issue that has costs or antagonises any group of voters," and this surely will.
In light of this, one ought to greet the news of a signed agreement with rather cautious optimism. The need to act, and act quickly, is paramount, and in the immortal words of 24's Jack Bauer, "We're running out of time." But actual implementation will not be easy. Although the current crop of Democratic Presidential hopefuls all see tackling climate change as a priority, unless Congress can be persuaded to take action, those promises will be meaningless.
Ticketmaster, whose onerous fees are all but impossible to avoid if you want to see a live event, has inked an agreement with the National Football League to handle a new ticket resale program for the league. Dubbed the "NFL Ticket Exchange," the service will be operated and hosted by Ticketmaster, and will also be accessible from NFL.com. Ticketmaster and the NFL say that the service, which will launch in time for the 2008 season, will allow ticket holders to unload their tickets in a "secure and reliable way."
Ticketmaster currently runs exchange sites for 18 of the NFL's 32 teams. On team sites, tickets are sold at face value, with the company taking a cut at each end of the transaction (i.e., convenience fee, processing fee, printing fee, fee calculation fee). It's an attractive alternative to scalpers for those looking to get seats to an enticing match-up, but those who are looking to sell their tickets will certainly be tempted by the higher prices they can get from a site like StubHub, eBay, or Craigslist.
It's not known whether season ticket holders, who purchase the vast majority of tickets in the NFL, will be forced to use the NFL Ticket Exchange. Some of the teams that currently run similar sites prohibit season ticket holders from reselling their ducats—much to the chagrin of those who have paid thousands of dollars for personal seat licenses on top of the tickets.
One team has even gone to court in an attempt to hunt down and punish season ticket holders who have resold their tickets at a profit. This past October, the still-perfect New England Patriots obtained a court order forcing StubHub to cough up the names of the team's ticket holders who have sold tickets on the site. Superior Court Judge Allan van Gestel wrote that the team's desire to be good corporate citizens and report "to authorities those customers that they deem to be in violation of the Massachusetts antiscalping law" was a factor in his ruling.
That said, it's disingenuous to suggest that the Patriots—and other teams—are trying to take the high moral ground in the fight against scalpers. Yes, teams have an interest in making sure fans can afford to attend a game without taking out a second mortgage (although said mortgage will certainly come in handy when it's time to buy beer, hot dogs, a program, and a souvenir jersey).
What the teams are really after is complete control of the ticketing pie. Teams haven't historically profited from ticket resales, and in a market where teams are looking to extract every last bit of revenue possible, selling the same tickets twice looks awfully attractive. The Chicago Cubs have been doing it since 2002, allowing ticket holders to resell their passes on the team's site in exchange for a cut of the profits.
A decade ago, services like the NFL Ticket Exchange and StubHub were not possible, but the Internet and the way it breaks down market barriers has made ticket resale services a profitable market. Depending on the ticketing technology used, it's theoretically possible to track a single ticket from the point of issue to the stadium gate—especially in the case of e-tickets. That presents an opportunity for sports leagues and Ticketmaster—and a threat to resellers like StubHub.
How well the NFL Ticket Exchange is received will depend on a single factor: whether or not the league tries to make it the sole outlet for resale. With some teams having waiting lists in the tens of thousands for season tickets, the threat of losing the coveted tix will certainly steer many season ticket owners toward the league-sanctioned site. On the other hand, the dreaded Ticketmaster Tax might not make scalpers look that bad.
The software development kit for Google's Linux-based Android mobile phone operating system has been out in the wild for over a month now, plenty of time for developers to form opinions of the platform and assess the capabilities of the API. The verdict from seasoned mobile software programmers is somewhat mixed; some are even expressing serious frustration.
I put Android to the test myself in an attempt to see how bad the situation really is. What I discovered is a highly promising foundation that is plagued by transitional challenges and a development process that needs more work. Android has many bugs, some of which are impeding development. Unfortunately, Google's QA infrastructure for the platform is completely inadequate for a project of Android's scope and magnitude.
There is no public issue-tracking system for Android. Instead, users post information about the bugs they encounter in the Android Developer Google group and hope that one of Google's programmers sees it so that it can be added to Google's private, internal issue-tracking system. Users have no way to track the status of bugs that they have reported, and they never know whether the issue is being addressed at all until after it is resolved, at which point it is mentioned in the release notes for a new SDK release.
"Unfortunately there is currently no externally-accessible issue-tracking system," wrote Google developer Dan Morrill in response to complaints about the bug reporting situation. "We are considering how we might implement such a system, but we don't have an answer yet. The biggest snag is simply keeping our internal issue tracker in sync with an external one. So, it's a process problem, rather than a technical problem."
Companies like Skype, Nokia, and Trolltech all have public issue tracking systems for their software, so one has to wonder why Google, with all of its resources, can't do the same for Android. This is a pretty clear symptom of a dysfunctional development process. In an effort to minimize the frustration of not having centralized issue tracking, users have started to independently catalog known bugs at an unofficial Android wiki.
Another major problem with Android is lack of documentation. The API reference material doesn't provide enough information, and one sometimes has to experiment (that is, guess) to figure out what the parameters for various methods actually do. In many cases, I found the developer discussion group to be far more informative than the API documentation. I also grew frustrated with some of the inconsistencies in the API naming conventions, an issue that other developers have complained about as well.
Working with the layout model for the Android user interface can also be frustrating. The code samples mostly emphasize the XML-based user interface description language, so there aren't enough examples that demonstrate programmatic layout techniques.
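For what it's worth, a minimal programmatic layout looks something like the following. This is a sketch against the Android framework as I understand it, so it won't compile outside the SDK, and the activity and widget names here are my own illustrative choices rather than anything from the official samples:

```java
// Building a trivial UI in code instead of inflating an XML layout
// (sketch; HelloActivity and its contents are hypothetical).
public class HelloActivity extends Activity {
    @Override
    public void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);

        // Assemble the view hierarchy by hand.
        LinearLayout layout = new LinearLayout(this);
        layout.setOrientation(LinearLayout.VERTICAL);

        TextView label = new TextView(this);
        label.setText("Hello, Android");
        layout.addView(label);

        // Hand the finished tree to the activity.
        setContentView(layout);
    }
}
```

Figuring out even this much meant piecing things together from the widget class reference, because the shipped samples lean so heavily on the XML approach.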
A recent article in the Wall Street Journal illuminates problems encountered by other developers who are attempting to build applications for Android. "Functionality is not there, is poorly documented or just doesn't work," MergeLab mobile startup founder Adam MacBeth told the WSJ. "It's clearly not ready for prime time."
The strength of an android
Although Android has its share of problems, there are some places where it really shines. The Eclipse plug-in is quite effective and provides excellent integration with the Android emulator. Every time you initiate the debugging process in Eclipse, it will start up your program inside the emulator and connect it to the Eclipse debugger automatically. You don't even have to close the emulator between tests; you can just modify the code and run the debug process again and it will restart your program in the currently running emulator instance. The seamless support for breakpoint debugging is so effective that it feels like developing a regular desktop application.
The setup is also surprisingly easy if you already have Eclipse installed. You just unzip the SDK, install the Eclipse plug-in, tell it where you put the SDK, and you are good to go. Going from downloading the SDK to compiling my first Hello World program took me less than ten minutes. In that respect, Android provides a much better experience than Maemo, for instance, which is a real pain to set up.
There are a few other places where Android surprised me with goodness. The ScrollView widget appears to support kinetic scrolling right out of the box, which means that developers won't have to implement that at the application level like they do with Maemo. I was also impressed by Android's seamless support for screen rotation and multiple resolutions. The emulator comes with several skins that you can use to test your application at various screen sizes and in different orientations. My applications worked fine in both horizontal and vertical orientations without requiring any custom programming. The user interface changed to accommodate horizontal orientation in much the same way a regular desktop application changes when it is resized.
To test out the API, I wrote a few experimental Android programs, including a Twitter client. The API is moderately conducive to rapid application development, but there are still some gaps. Although the API offers a lot of really nice functionality for animated transitions, alpha transparency, and other similar visual effects, it doesn't make it easy to create applications that have a really polished look and feel. For my Twitter application, for instance, I wanted to put a nice picture in the background and have a transparent, rounded rectangle with a border behind each tweet, but those kinds of embellishments end up being way more trouble than they are worth in Android. By comparison, getting the same effect with XUL only requires a few trivial lines of CSS.
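For the record, the closest I could get was along these lines, a sketch using the framework's GradientDrawable class. The view name and color values are illustrative assumptions on my part, and even when it works, it's a far cry from a few lines of CSS:

```java
// Hypothetical sketch: a translucent, rounded, bordered background for
// each tweet view, built programmatically with GradientDrawable.
GradientDrawable tweetBackground = new GradientDrawable();
tweetBackground.setColor(0x80FFFFFF);      // white at ~50% alpha
tweetBackground.setCornerRadius(8.0f);     // rounded corners
tweetBackground.setStroke(1, 0xFF888888);  // thin gray border

// tweetView is whatever view holds a single tweet in the list.
tweetView.setBackgroundDrawable(tweetBackground);
```

Getting this to compose nicely with a full-screen background image and list scrolling was where the real trouble started.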
I had a hard enough time getting the basic layout to look right even without thinking about embellishments. Android would benefit greatly from a drag-and-drop design utility that provides an interactive approach to layout and exposes all of the widget attributes in a clear and expressive way.
Despite some of the bugs and limitations in the API, it is definitely a viable and effective platform for application development. My Twitter client was only about 130 lines of code, which is impressive. That said, I could still crank out applications faster with Java FX Script. In general, I think that Java FX Mobile is probably the competing platform that is most analogous to Android.
The inevitable comparisons between Android and the iPhone platform seem a bit misguided now that I've really worked with Android. The iPhone platform seems to be tailored to a very specific kind of user experience that is particular to the hardware. Apple has always been good at leveraging the tight coupling between its hardware and software, and the iPhone is no exception. With the iPhone, Apple has sacrificed the potential for hardware diversity but gained in the process the ability to make innovative technologies like multitouch a ubiquitous part of the user experience. Android, on the other hand, has to be designed from the ground up to support an extremely diverse range of hardware devices with vastly different capabilities.
Android's design seems to be heavily focused on making as few assumptions as possible about the kinds of devices on which the software will run. And seriously, some of those devices are going to be monstrously ugly clunkers compared to the iPhone.
It's important to remember that Android is still in the early stages of development and that its present weaknesses aren't indicative of failure. Devices with Android won't even start to hit the market until later next year, so this is like a pre-release aimed at spurring early development so that a healthy ecosystem of third-party software applications is available at launch.
Despite its pre-release status, some of Android's weaknesses are indefensible. Google's Android team needs to get its act together and figure out how to interact with a rapidly growing community of professional and enthusiast developers. The "release early and often" strategy is generally a good thing, but it utterly fails when infrastructure isn't in place to facilitate proper handling of user feedback. Google has a habit of embracing the early release philosophy with a little too much enthusiasm, and the current situation with Android is emblematic of that approach.
As if we needed another reason to be cautious about how much information we post on social networking sites like Facebook, a recent lawsuit over data mining reminds us once again that nothing is entirely private. Facebook filed a lawsuit earlier this year against a number of anonymous individuals responsible for hitting Facebook's servers hundreds of thousands of times in an effort to scrape information on users of the site. It has since discovered the identities of a handful of those people this month, some of whom are associated with a Canadian company that pays for affiliate referrals to porn sites.
The complaint, originally filed in June in the US District Court for the Northern District of California, says that a certain IP address attempted to access Facebook's system to harvest information between June 1 and 15 of this year. The attempts were unauthorized, says Facebook, and generated error messages. This did not stop the defendants from trying some 200,000 times, though, which caused Facebook to eventually block the IP.
More IP addresses quickly picked up where the old one had left off, however, and Facebook claims that investigating the whole incident has cost the company over $5,000. By filing discovery requests with the associated ISPs, the company was able to identify a number of individuals associated with the IPs that were pillaging its servers. Brian Fabian, Josh Raskin, and Ming Wu were all fingered by the ISPs, in addition to Istra Holdings, a company responsible for SlickCash.com. Istra Holdings was listed as the owner of one of the IPs in question, and Fabian was listed as the "Manager" contact for that company.
Facebook charges that the defendants—several of whom are still anonymous John Does—violated the Computer Fraud and Abuse Act by recklessly attempting to access Facebook hundreds of thousands of times. They also allegedly violated the California Comprehensive Computer Data Access and Fraud Act and breached the Terms of Service set forth by the service that the defendants agreed to upon signing up. "The Defendants' breach of the Terms of Service have caused and continue to cause Facebook to expend resources to investigate the attempted unauthorized access and abuse of its computer network and to prevent such access or abuse from occurring," reads the amended complaint.
Facebook never indicates whether or not any information was accessed—it sounds as if it wasn't—but Istra and its employees could have been using the site to harvest e-mail addresses for "marketing" purposes (or worse, collecting more detailed identity information for other malicious reasons). Whatever the reason, Facebook wants the court to put an end to it and has asked for injunctive relief, in addition to a trial and unspecified damages.
Ever since the launch of the National Do Not Call registry, the American dinner hour has been quieter. Well, mostly. There have been a few violators here and there, but overall, the level of telemarketing calls has gone down significantly in recent years thanks to the registry. Both the House and Senate have passed bills this month meant to improve upon it, ensuring that the majority of your spam stays off your phone line (and remains in your inbox).
The first of the bills, the Do-Not-Call Improvement Act of 2007 (S 2096), was recently passed by the Senate, and a nearly identical version (HR 3541) was passed by the House earlier this month. As the registry functions today, users must re-register their phone numbers every five years if they want to remain on the list, but the bills change that by eliminating the automatic removal of phone numbers after a certain period of time. Now, registrants will remain on the do-not-call list indefinitely until they request to be removed, or if the number becomes no longer valid, becomes disconnected, or gets reassigned. According to the bill, the FTC can check phone numbers periodically for this, and may remove them from the list if they no longer belong to the person who registered them.
"By enacting this legislation, the Senate has taken an important step toward making the Do-Not-Call list the Never-Call list," said Senator Ted Stevens (R-AK), cosponsor of the Senate version of the bill, in a statement. Thanks, Ted.
Not having to constantly re-register is a convenient change for us at home, but that's not all Congress has in store. The Senate also passed the Do-Not-Call Registry Fee Extension Act of 2007, which will allow the FTC to continue collecting fees required to operate the registry (money, always helpful in getting things done).
As it stands now, the FTC only has the authority to collect operational fees through the end of 2007. The bill will permanently extend the FTC's ability to collect the fees, which come from telemarketing companies that are required, by law, to keep up-to-date lists of phone numbers that they cannot call. Under the new legislation, the Congressional Budget Office estimates (PDF) that the FTC will collect some $107 million over the next five years, which is a couple million per year more than the FTC collects currently (roughly $19 million was collected in 2006, for example).
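The arithmetic behind that comparison is straightforward:

```java
// Checking the CBO projection against the 2006 figure cited above.
public class FeeEstimate {
    public static void main(String[] args) {
        double projectedTotal = 107_000_000; // CBO: fees over five years
        int years = 5;
        double collected2006 = 19_000_000;   // roughly what came in for 2006

        double perYear = projectedTotal / years;   // $21.4 million per year
        double increase = perYear - collected2006; // ~$2.4 million more
        System.out.printf("%.1fM per year, %.1fM more%n",
                perYear / 1_000_000, increase / 1_000_000);
    }
}
```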
Those fees are apparently being put to work in going after violators, too. Last month, the FTC slapped several companies, ranging from Craftmatic to DT, with $7.7 million in penalties for not following the requirements of the registry. The settlement with Craftmatic and its subsidiaries was the second-largest in history for Do-Not-Call violations—the largest was a settlement with DirecTV in 2005 for $5.3 million.
So here's to many more years of call-free dinners! Now, if only someone could invent a Do-Not-Call registry for annoying family members.
The results are now in from a thorough, $1.9 million test of the voting machines that Ohio has used in elections over the past few years, and they paint about as awful a picture of the state's electoral apparatus as one would expect, given the steady stream of grim news out of counties like Cuyahoga. The two private-sector and three academic research teams that carried out the Evaluation & Validation of Election-Related Equipment, Standards & Testing (EVEREST) study of Ohio's e-voting systems did not mince words in the 86-page executive report that they released this past Friday (or, if words were minced, then one can imagine that the unminced version wasn't family-friendly): "The findings of the various scientists engaged by Project EVEREST are disturbing. These findings do not lend themselves to sustained or increased confidence in Ohio's voting systems."
Ohio Secretary of State Jennifer Brunner, a woman whose recent and spectacular bungling of a Cuyahoga County recount gives ample reason to doubt her commitment to fair and accurate elections, didn't even bother trying to sugarcoat this report.
"To put it in every-day terms, the tools needed to compromise an accurate vote count could be as simple as tampering with the paper audit trail connector or using a magnet and a personal digital assistant," Brunner said in a statement. Note that Brunner here is describing machines that have been in use in Ohio since before the 2004 presidential election. This isn't some glimpse of how bad things might be in November 2008. It's a look at how bad they've been all along.
Brunner went on to make the following unintentionally funny remark, which was presumably intended to inject a note of confidence into the release of a report that could almost have been titled, Barn Door Left Open; Whereabouts of Horse In Doubt: "It's a testament to our state's boards of elections officials that elections on the new HAVA mandated voting systems have gone as smoothly as they have in light of these findings."
E-voting in Ohio has gone "smoothly"? Really?!
Speaking of damage control attempts, however feeble: Premier released this press statement in response to Friday's report, and it contains plenty to chuckle at. I thought this gem was particularly priceless:
"It is important to note that there has not been a single documented case of a successful attack against an electronic voting system, in Ohio or anywhere in the United States."
Given the magnitude of the vulnerabilities that the report details in Premier's systems and the impossibility of conducting a meaningful audit with those systems, this is sort of like a blind and deaf person saying, "Despite my habit of cleaning my first-floor apartment in the nude with all of the street-facing windows open, I have no documented evidence that anyone has ever seen me naked."
Almost 1,000 pages of bad news
The voting systems investigated in the study came from ES&S, Hart InterCivic, and Premier Election Solutions (formerly Diebold). The researchers evaluated individual components, whole systems, and elections procedures, and the detailed reports they produced on each vendor's systems describe technical and procedural problems with almost every aspect of each system. Like so many of their kind that litter my hard drive after years of e-voting coverage, the EVEREST reports list page after page of flaws, vulnerabilities, and bone-headed design decisions, many of which would boggle my mind were it not already completely boggled out on this topic by said prior coverage.
Ultimately, the voting systems got failing grades in the following main areas tested, according to the "Findings" section of the executive report:
Insufficient Security: The voting systems uniformly "failed to adequately address important threats against election data and processes," including a "failure to adequately defend an election from insiders, to prevent virally infected software… and to ensure cast votes are appropriately protected and accurately counted."

Security Technology: The voting systems allow the "pervasive mis-application of security technology," including failure to follow "standard and well-known practices for the use of cryptography, key and password management, and security hardware."

Auditing: The voting systems exhibit "a visible lack of trustworthy auditing capability," resulting in difficulty discovering when a security attack occurs or how to isolate or recover from an attack when detected.

Software Maintenance: The voting systems' software maintenance practices are "deeply flawed," leading to "fragile software in which exploitable crashes, lockups, and failures are common in normal use."
The EVEREST executive report's conclusions summarize the findings as follows:
Unfortunately, the findings in this study indicate that the computer-based voting systems in use in Ohio do not meet computer industry security standards and are susceptible to breaches of security that may jeopardize the integrity of the voting process. Such safeguards were neither required by federal regulatory authorities, nor voluntarily applied to their systems by voting machine companies, as these products were certified for use in federal and state elections.
In lieu of my typical bullet list of outrageous report highlights—obvious admin passwords, a complete lack of encryption on critical files, a reliance on easily manipulated "security tape" to prevent tampering, the ease with which anyone can boot some of the machines into admin mode, and other typical problems that were there in spades in this report—I'll just highlight one critical flaw in an optical scan machine of the type that everyone wants to replace the touchscreens with.
The EVEREST researchers described a vulnerability in the ES&S M100 optical scanner in which simply flipping the write-protect switch on the device's CF card to "on" would result in a precinct-wide undercount that's extremely hard to detect.
If this switch is activated after the polls are opened and reset before the polls are closed…the internal counts of the m100, and the paper tape reports will be correct and the system will function normally, but the counts of the votes scanned will not be added to the electronic media delivered to the central Board of Elections… To add to the level of difficulty in detection of the exploit, while the physical ballots are in the ballot box in the correct number and the paper tape shows the correct number, the memory card is delivered to the central Board of Elections where it is read and processed. The current processes in use in most polling places are a simple review of the paper tapes, which would be correct. As such, it is likely that unless close scrutiny or recounts of the precinct were performed that surgical use of this vulnerability would go undetected.
Note that this write-protect switch is apparently easy to flip accidentally.
Obviously, turning on the write-protect for the duration of a whole election would cause that machine's precinct to report "zero" votes cast, thereby tipping off election officials that something was wrong. But if a malicious precinct worker were to just reach down periodically and flip the switch on and off during the course of a day's polling, he or she could easily cause a serious undervote that would only be detected by a hand count of the optical scan ballots.
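To make the mechanics of this attack concrete, here is a minimal sketch of why the undercount is so hard to spot. This is hypothetical Python, not vendor code, and the class and field names are my own invention: the point is simply that the internal counter and paper tape record every ballot, while writes to the memory card silently fail whenever the switch is on.

```python
class M100Sim:
    """Toy model of the scanner's three tallies (hypothetical, not ES&S code)."""

    def __init__(self):
        self.internal_count = 0   # scanner's internal counter
        self.paper_tape = 0       # results printed on the paper tape
        self.card_count = 0       # votes written to the CF card
        self.write_protect = False

    def scan_ballot(self):
        # The internal counter and paper tape always record the ballot...
        self.internal_count += 1
        self.paper_tape += 1
        # ...but the write to the CF card silently fails while protected.
        if not self.write_protect:
            self.card_count += 1

scanner = M100Sim()
for ballot in range(100):
    # A malicious worker flips the switch on for ballots 40 through 59.
    scanner.write_protect = 40 <= ballot < 60
    scanner.scan_ballot()

print(scanner.internal_count, scanner.paper_tape)  # 100 100 -- both look correct
print(scanner.card_count)  # 80 -- what the Board of Elections actually receives
```

A paper-tape review at the precinct checks only the first two numbers, which match; the 20-vote gap lives solely on the card delivered downstream, which is why only a hand count of the physical ballots would expose it.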
Of course, the problems with the optical scan machines didn't end there. In a section of one report document that brought back memories of hanging chads for me, the research team from a company called Systest reported that the M100 also had serious problems properly recognizing votes on ballots where the ovals were less than fully filled in. "It is possible that clearly indicated votes may not be recognized by the scanner," Systest stated in its report, "and if the election is not configured to warn of undervotes, those votes will be lost. It's also possible that overvotes may not be recognized as such and warned about if made with marks that the scanner does not recognize."
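The failure mode Systest describes is essentially a thresholding problem. The following is a deliberately simplified sketch, with invented names and a made-up fill threshold (real scanners use calibrated optics, not a single cutoff): a mark a human would read as a clear vote can fall below the machine's detection threshold and be counted as blank.

```python
def classify_oval(fill_fraction, threshold=0.5):
    """Toy mark classifier (hypothetical; not how the M100 actually works).
    An oval darker than the threshold counts as a vote; anything lighter
    is treated as blank, even if a human would read it as a clear mark."""
    return "vote" if fill_fraction >= threshold else "blank"

def count_race(ballot_marks, threshold=0.5):
    """Tally one race on one ballot from a list of oval fill fractions."""
    votes = [m for m in ballot_marks if classify_oval(m, threshold) == "vote"]
    if len(votes) == 0:
        return "undervote"   # lost unless the election warns of undervotes
    if len(votes) > 1:
        return "overvote"
    return "counted"

# A clearly intended but lightly pencilled mark (30% fill) is silently dropped.
print(count_race([0.3]))        # undervote
print(count_race([0.9, 0.6]))   # overvote
print(count_race([0.9]))        # counted
```

The danger Systest flags is the first case: if undervote warnings are disabled, the lightly-marked ballot is accepted without complaint and the vote simply vanishes.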
Nonetheless, optical scan to the rescue
In the wake of the report, Brunner is talking about scrapping all of the direct recording electronic machines (DREs, aka "touchscreens") in the state and moving to a system in which Ohio voters manually mark optical scan ballots that are then shipped off to a centralized location for scanning. In order to give this system enough time to work, Brunner is proposing that early voting begin a full fifteen days before the election date, with polling locations open from 7am to 7pm six days a week, and from noon to 7pm on Sundays.
The move to centralize the actual ballot scanning is intended to cut down on the number of points at which attackers could influence the polling using simple tricks like the CF card "write protect" manipulation described above. Unfortunately, it would also concentrate the entire voting system into fewer points of failure, so that fewer bad actors would be needed to steal an election. Unless the security at the centralized scanning location is extremely tight and the people doing the ballot scanning are 100 percent trustworthy, this portion of Brunner's plan could make stealing an election even easier.
Even though the long-term plan is to replace all of the DREs in the state with optical scan machines, the report admits that this won't be possible in time for the March 2008 presidential primaries. There is some hope, however, that the new system (such as it is) will be in place for the 2008 presidential election.
A new study by the NPD Group doesn't paint a great picture of the current state of online productivity suites. If the numbers from a survey of 600 US residents are to be believed, most of us have never heard of, let alone tried, products such as Google Docs or Zoho. Considering factors such as the suites' limited visibility and the industry's relative youth, though, these numbers could shift significantly in the coming years.
According to NPD Group numbers, 73 percent of the 600 Americans surveyed have never heard of online office suites, while another 20 percent have, but simply haven't tried any for one reason or another. The remaining six percent of respondents are split among those who have tried the suites but haven't used them since, those who use them infrequently, and those who use both online suites and desktop apps like Microsoft Office. Based on these numbers, some have already written a eulogy for Web 2.0 office suites, but that assessment might be premature.
There are various reasons for the perceived limited success of online office suites, starting with their lack of visibility. As they stand right now, online office suites have a hard time getting in front of users because they aren't offered as boxed software that can adorn retail shelves. Even Google's toolbar that sits above many of its services doesn't highlight the Docs product; users have to click the More button and find Docs among a sea of other Google offerings. To try to overcome this obstacle and snag more consumer attention, Zoho plans to mimic successful online offerings like Apple's .Mac package by getting actual retail boxes on store shelves in 2008. The box will be virtually empty, containing little more than starter documentation that points customers to the site and helps them get started with its various web apps.
Another factor contributing to the online office suite's limited success with consumers so far is the relative infancy of the industry itself. Google Docs, for example, began life in 2005 as Writely, from a startup called Upstartle, and was only officially launched as a customizable product for businesses, education, and consumers in August 2006, with a presentation application added just this past September. The technology behind rich web apps has improved steadily since then, but online office suites simply haven't been around long enough to penetrate the public consciousness, a difficult task in the face of the popularity of industry titan Microsoft Office. And web-based offerings arguably still need some time to mature before they're serious competitors for full-fledged office suites like Office, OpenOffice.org, and even iWork.
On a larger scale, though, the consumers surveyed in the NPD Group's study are likely last on the list of targets for online office suites. Just like Microsoft Office, Google and Zoho are primarily after businesses, educational institutions, and other organizations, markets in which both companies have reported strong success so far. Google's list of Apps customers has been steadily growing, including major wins like Procter & Gamble, General Electric, and Prudential.
Seen in this light, online office suites are already enjoying early success in non-consumer markets. As usual, consumer adoption could increase as more users encounter the suites in a business setting and want to bring some of those capabilities home. As online office suites mature and gain visibility through efforts like Zoho's retail boxes, home users looking for more collaboration and less feature bloat could soon begin finding the right balance with online office suites, just as they have for years with webmail and picture-sharing services.
Ever since Sony released its 1.60 firmware update for the PlayStation 3 that included a Folding@home client, the machine has simply dominated the charts in the distributed computing community. Our own Jon Stokes explained why, and now Sony has released a number of nice updates to its client.
First, for those of us who want to help the effort but don't like the idea of leaving machines on all night, there is now a timer that allows you to tell the PlayStation 3 to run for a set amount of time and then shut down. The instructions are simple.
Go to Settings menu, select Automatic Shutdown and then After Current. You will see a little clock appearing on the top right hand corner of the screen. When this clock reaches zero, the machine will power off completely so you can sleep quietly through the night. In this option the machine will power off after sending the data back to Stanford University so your contribution to the project is maximized.

You may also choose to shut down after a limited period of time (for example, 3 or 4 hours). To do that choose the Automatic Shutdown option and click on the “In 01h:00m” timer. You will now be able to change the timer settings. After rotating the timer to a new time, you should now see your selected timer appear at the top right corner of the screen. This timer will count all the way down to 0 and then power off the system. To disable the active timer at any time, go to Settings, choose Automatic Shutdown and then Disable.
You can also now set up some background music to play as you gleefully fold your proteins. Equally easy.
To activate the music player select Music and then a channel. The channels are automatically populated with songs you have stored on your Hard Disk Drive! For holiday music I prefer to choose Dramatic (yeah!) and voila – music is playing.

If you wish to skip to another track, just use a combination of X + left or right arrow. You can also change the channel by choosing X + up or down arrow. Happy listening!

To stop music from playing, select Music again from the main menu and choose “disable.” Your music playback selection will be saved between consecutive runs of Folding@home.
These new updates make the PS3 Folding@home client even easier to use, and the project helps scientists find ways to treat serious diseases including Parkinson's, Alzheimer's, and some cancers. It's great that Sony is making it so easy to help out.
Also be sure, after getting the update, to contribute your cycles to Ars Technica's own Team EggRoll.