One of the incredibly useful things about Mac OS X in general is the potential for integration between applications. Type the name of an Address Book contact in Gus Mueller's VoodooPad and it'll link the name, offering a useful contextual menu. Collect icons in CandyBar? Right-click one and you can set it as your iChat avatar, with the option of applying any of Leopard's new image effects. And let's not even get started on the power that AppleScript and its mortal-friendly Automator enable for moving and manipulating data between applications.
It is with this integration in mind that some new features in a couple of Mac OS X e-mail clients deserve a highlight, as they're fairly game-changing developments for those who have to work with mail on a regular basis. First is the discovery of Leopard Mail's support for message URLs, explored in depth by John Gruber at Daring Fireball. Though the new feature is strangely undocumented by Apple, users have discovered that Mail now supports a system of URLs (yes, URLs can do more than point to porn) that allow you to link to specific messages from other applications. For example, you could include links to a couple of Mail messages from coworkers alongside notes, pictures, and web links in OmniOutliner or Yojimbo documents. This opens up a whole new productivity world, allowing you to bring your e-mail into applications that aren't specifically designed to integrate with Leopard's Mail.
To help make it easy for users to harvest these message links (as of 10.5.1, Mail doesn't provide an option, and not all applications create the proper Mail message URL from a simple drag and drop yet), Gruber includes the code for a simple AppleScript at the end of his post. Save that script with Script Editor (found in /Applications/AppleScript/) and call it via any number of methods, such as Mac OS X's own AppleScript menubar item, Red Sweater's FastScripts, or launchers like Quicksilver and LaunchBar. The newest Leopard version of indev software's MailTags plug-in for Mail also provides a dedicated menu option for copying a message URL.
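For the curious, the scheme Gruber documented is simple: take the message's RFC 2822 Message-ID header and wrap it in percent-encoded angle brackets after a message:// prefix. Below is a minimal Python sketch of that idea; the function name and the exact set of characters left unescaped are our own illustration, since Apple hasn't documented the format officially.

```python
from urllib.parse import quote

def mail_message_url(message_id: str) -> str:
    """Build a Leopard Mail message: URL from an RFC 2822 Message-ID.

    Assumes the message://%3C...%3E form described in Gruber's post,
    where the bare ID is wrapped in percent-encoded angle brackets.
    """
    # Drop any literal angle brackets from the raw header value.
    message_id = message_id.strip().lstrip("<").rstrip(">")
    # Percent-encode the ID (keeping common Message-ID characters
    # like @ and . intact) and wrap it in %3C ... %3E.
    return "message://%3C" + quote(message_id, safe="@.-_") + "%3E"

print(mail_message_url("<ABC123@example.com>"))
```

Paste the resulting URL into any application that accepts links; clicking it should send Mail to the original message, provided that message still exists in one of your mailboxes.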
If this integration has your productivity gears turning, but Gmail is your client of choice, Mailplane could offer a nice compromise. As a desktop Gmail browser that allows for things like drag-and-drop file attachment and even an iPhoto plug-in for e-mailing photos, Mailplane is more or less a bridge between the convenience of webmail and the integrated power of desktop clients.
New in the most recent private betas of Mailplane (1.55.4 and above) is a similar URL system for Gmail messages, which appears to work on both Leopard and Tiger. Complete with an Edit > Copy as Mailplane URL command, the feature lets users paste custom mailplane:// URLs into other applications to bring mail out of Gmail and into their productivity workflows. Remember, though, that Mailplane is still a browser for Gmail, albeit with the aforementioned modifications and other useful things like Growl notifications and support for multiple accounts (including Google Apps). Since it isn't an offline mail client, you'll still need to be online for a Mailplane URL to connect to its corresponding Gmail message.
Still, these new message URL features in two useful Mac e-mail clients will likely see some official integration love from other third-party apps in the near future. Aside from DIY AppleScripts, apps like Yojimbo and TextMate can only benefit from being able to include e-mail in the productivity mix. Knock knock, third parties—how about it?
Google's Chinese business has been consistently questioned and criticized, but now a Chinese company has taken issue simply with its name.
Beijing Guge Sci-Tech is suing Guge, or Google China, claiming that the Internet search giant is trampling on its good name and, perhaps most important, its registered Mandarin business name. Guge Sci-Tech registered its name in 2006, a few months before Google did. Now the tech company wants Google to change the name of its Chinese branch and pay an undisclosed sum to cover all its legal fees.
Beijing Guge Sci-Tech registered its name at the Beijing Municipal Industrial and Commercial Bureau on April 19, 2006, and Google followed with registering "Guge" on November 24 that same year. This similarity in names, Beijing Guge Sci-Tech argues, has confused the public and damaged its business.
However, if Google had been considering the name before Beijing Guge Sci-Tech registered it, that could work in Google's favor, since it would show the company registered its name in good faith. And, for all we know, the China-based Guge may be little more than a trademark registration or a cybersquat, as information on the company is extremely hard to come by. Google China suggests that Guge Sci-Tech is indeed looking for an easy payout, perhaps having picked up on Google's plans by paying attention to Western media. Everyone knew Google would be changing its name in China, as "Goo-Gol" means "old" or "lame dog."
Also at issue between the companies is the definition of guge, which is not a normal Chinese word. Google says it's a combination of Chinese characters that mean "valley" and "song"—a reference to Google's Silicon Valley ties. Beijing Guge Sci-Tech disagrees, stating the word means "a cuckoo singing in the spring, or the sound of grain singing during the harvest autumn time."
At FOSSCamp in October, skilled eye-candy expert Mirco Müller (also known as MacSlow) hosted a session about using OpenGL in GTK to bring richer user interfaces to desktop Linux applications. Building on the technologies that he presented at FOSSCamp, Müller recently published a blog entry that demonstrates his latest impressive experiments with OpenGL, GTK, and offscreen rendering.
Müller is currently developing the GDM Face Browser, a new login manager for Ubuntu that will include enhanced visual effects and smoothly animated transitions. To implement the face browser, he will need to be able to seamlessly combine OpenGL and conventional GTK widgets. Existing canvas libraries like Pigment and Clutter are certainly robust options for OpenGL development, but they do not offer support for the kind of integration that he envisions.
The solution, says Müller, is to use offscreen rendering and the texture_from_pixmap OpenGL extension. In his experimental demo programs, he loads GTK widgets from Glade user interface files, renders them into offscreen pixmaps, and then uses texture_from_pixmap to display the rendered widgets in a GTK/GLExt drawing area, where they can be transformed and manipulated with OpenGL. Müller has created several demo videos that show this technique can be used to apply animated transitions and add reflections to widgets. The visual effects implemented by Müller with GTK and OpenGL do not require the presence of a compositing window manager.
We talked to Müller to get some additional insight into the challenges and benefits of incorporating OpenGL into desktop applications. One of the major deficiencies of the current approach, he says, is that users will not be able to interact with GTK widgets while they are being animated with OpenGL—a limitation that stems from the lack of support for input redirection at the toolkit level.
"Interaction will only be possible at the final/original position of the widget," Müller told us, "since gtk+ has no knowledge of the animation/transformation taking place. I consider it to be much work to get clean input-redirection working in gtk+. There might be some ways to achieve it using hacks or work-arounds, but that should be avoided."
Eye candy or UI improvement?
Although some critics might be inclined to prematurely deride Müller's work as indulgent eye-candy, he primarily envisions developers adopting OpenGL integration to tangibly improve the user experience by increasing usability. "I would like to see [OpenGL] being used in applications for transition effects," he says. "We can right now improve the visual clues for users. By that I mean the UI could better inform them what's going on. Widgets [shouldn't] just pop up or vanish in an instant, but gradually slide or fade in. These transitions don't have to take a lot of time. As a rule of thumb half a second would be sufficient."
In particular, Müller would like to experiment with adding animated transition effects to the GTK notebook and expander widgets. He also has some creative ideas for applying animations to the widget focus rectangle in order to make its movement more visible to the user. Müller also discusses some applications that would benefit from OpenGL-based transitions. In programs like the Totem video player, he says, the playback controls in fullscreen mode could slide in and out rather than just appearing and disappearing. Alluding to the webcam demo that he made for FOSSCamp, he also points out the potential for incorporating OpenGL effects into video chat programs like Ekiga. Müller has also long advocated using visual effects to create richer and more intuitive user interfaces for file and photo management software—ideas that he has put into practice with his brilliant LowFat image viewer.
"The kind of effects possible if you can render everything into a texture, map it to a rectangle or mesh and then do further manipulations with GL (or shaders) are next to limitless," says Müller. "Just look at what Compiz allows you to do to windows now. Imagine that on the widget-level."
We also asked Müller to explain the performance implications of using OpenGL in GTK applications. "The memory-overhead is directly linked to the size of the window, since all the rendering has to happen in an OpenGL-context filling the whole window. The bigger the window, the bigger the needed amount of video memory," Müller explains. "The load caused on the system is directly linked to the animation-refresh one chooses. 60Hz would be super smooth and very slick. But that's a bit of an overkill in most cases. One still gets good results from only 20Hz."
There are still some performance issues with texture_from_pixmap that are actively being resolved. "Due to some issues in Xorg (or rather DRI) there are still too many copy-operations going on behind the scenes for GLX_EXT_texture_from_pixmap," says Müller. "There are also locking issues and a couple of other things. At the moment I cannot name them all with exact details. But more importantly is the fact that there's currently work being done—in the form of DRI2—by Kristian Hoegsberg (Red Hat) to fix these very issues on Xorg. I cannot stress enough how important this DRI2 work is!"
Although using OpenGL incurs additional memory consumption and system load, Müller says that the impact is small enough to make his current approach viable for projects like the Ubuntu Face Browser.
Although individual developers can use Müller's boiler-plate code to incorporate OpenGL integration into their own GTK programs, Müller suspects that support for this technique will not be included directly in GTK at any time in the near future, which will make it harder for developers to adopt. "Right now it is all happening in application-level code and not inside gtk+," Müller explains. "Since I use GLX_EXT_texture_from_pixmap to achieve efficient texture-mapping out of the widgets' pixmap it is currently an X11-only solution. Therefore I imagine they might want to see a more platform-independent solution to this first." The GTK offscreen rendering feature that Müller uses also currently lacks cross-platform support.
Despite the challenges and limitations, Müller's creative work opens the door for some profoundly intriguing user interface enhancements in GTK and GNOME applications. "There are more things possible than you think," says Müller in some concluding remarks for our readers. "Don't have doubts, embrace what's available. X11 and gtk+ are pretty flexible. People who just don't seem motivated to explore ideas and test their limits (or the limits of the framework), should remember that this is Open Source. It lives and thrives only when people step up and get involved. Just f*****g do it!"
Although completing a genome provides science with lots of information, the completion of several genomes provides us with far more than the individual genomes do. Comparisons between the genomes of related organisms can provide us with information about the changes in gene content that accompany major evolutionary transitions. A great example of this is how the sequencing of the Chlamydomonas genome shed light on the origin of plants. Today, Science will be offering up an advance publication that describes the sequencing of a moss, a relative of Chlamydomonas and a descendant of the world's first land plants.
The organism in question is a Bryophyte called Physcomitrella patens. The genome itself is an unassuming 480 Megabases and contains about 36,000 genes. Its significance resides primarily in the fact that Bryophytes are the modern descendants of the first multicellular plants that made their way onto land. If you view Chlamy as lying on the border between algae and animals, you can view Bryophytes as on the border between Chlamy and trees. They are clearly adapted to life on land, but they still need a fairly wet environment, lacking as they do adaptations such as a complex root system and the vascular transport of water.
Physcomitrella itself appears to have only undergone a single whole-genome duplication, in contrast to the multiple rounds of duplications that characterize many of the flowering plants. As a result, there are far fewer duplicated genes and most gene families have fewer members.
Based on its capacity for sending signals between cells, the organism appears partly adapted to the multicellular lifestyle. It contains everything needed to make and use cytokinins, which regulate plant morphology. But it seems to lack other intercellular signaling molecules, such as auxins. It may be able to use ethylene, which flowering plants use to regulate fruit ripening, but the evidence is somewhat sketchy.
It appears to be partly adapted for surviving freezing and desiccation. Like flowering plants, it has a large number of ABC transporters, which reside on cell membranes and help control the flow of material into and out of the cell. It also has enhanced DNA repair capabilities compared to Chlamy, suggesting it can cope with higher exposure to sunlight. In fact, it appears to be well equipped to benefit from a range of light conditions; the authors say it "has increased the genetic playground for photosynthesis and connected carbon-based metabolism."
One of the best features of the new genome is that many of the predictions that come out of the genome analysis will be testable. Physcomitrella handles DNA repair the same way that yeast does, by homologous recombination. As such, it should be easy to knock out the genes we have now identified and examine the impact that has on its ability to survive in a terrestrial environment.
Science, 2007. DOI: 10.1126/science.1150646
Google boosted its army in the social networking battle last month by announcing OpenSocial, a platform for providing applications and widgets that any site or social network can adopt. This concept of "write once, run anywhere" is certainly an appealing one for third parties looking to get their products in front of lucrative social networking eyeballs. It was clearly also a response to Facebook's open API initiative that it launched back in May, which almost instantly opened the floodgates to a new rush of users and developers that has only increased in size. Though Google and Facebook have been scoring their respective social networking partnerships over the last few months, two separate announcements this week from major web players Bebo and Meebo are the latest to help tip the scales in Facebook's favor.
MySpace and Facebook may dominate social networking traffic in the US, but Bebo holds strong at third place. The company's visibility is likely to increase as well with its new implementation of the Facebook Platform. This will allow developers to build their apps simultaneously for Bebo and Facebook with a minimal amount of fuss—perhaps none at all. Considering that Facebook reportedly had over 32 million unique visitors in October 2007 and Bebo had almost 4.5 million, this new application compatibility can only be a boost to the traffic of both companies. Interestingly though, Bebo also announced its plans to eventually support Google's OpenSocial sometime in 2008, making it (potentially) the first social networking site to embrace both platforms. Still, Facebook's obvious place at the top of Bebo's list will only be a boon to their offerings—while sticking a thorn in OpenSocial's side.
The second pro-Facebook announcement comes from Meebo, the reigning king of web-based chat, which now claims over 20 million unique monthly users. Offering a range of products, advertising opportunities, and unique features like co-op games with chat buddies, Meebo's announcement of a Meebo Rooms Partner Edition custom-tailored for the Facebook Platform will be another major symbiotic win. As a social site, Facebook's integration of a web-based chat leader that allows users to easily share links and play embedded videos will undoubtedly boost traffic and the amount of time users spend at the site. Meebo can even bring its other major partnerships to the chat rooms it provides for Facebook, like the deal it made with Roc-A-Fella Records in September to allow users in a room to preview Kanye West's new album in a social atmosphere.
Likely to Google's dismay, Meebo did not mirror Bebo's intentions to also support OpenSocial at a future date.
While third parties and independent developers have been quick to hop on both Facebook's and Google's platforms, these announcements from significant players in other social corners of the web are a major win for Facebook. The social network also has the advantage of being a visible destination with an established user base for developers. Google, by contrast, could have a harder time snagging partners because OpenSocial feels more like an ambiguous middleman, with no major faces to match with its name.
Ultimately though, the battle for the social networking space is just getting started. Google has notoriously deep pockets and a broader grasp on the web, while Facebook's valuation still sits at an amazing $15 billion. Grab some popcorn; this should be a good show.
A pair of paleontological finds are reported in the literature this week. The first is a newfound genus and species of one of the massive sauropodomorphs that was discovered in Antarctica—only the second Jurassic dinosaur ever found there. The second discovery comes from Africa and represents one of the largest carnivorous dinosaurs ever found.
The latest issue of Acta Palaeontologica Polonica contains an article (open access) that describes the discovery of a massive sauropodomorph. The fossils, which consisted of partial foot, leg, and ankle bones, were found on Mt. Kirkpatrick near the Beardmore Glacier in Antarctica at an elevation of more than 13,000 feet. Sauropodomorph dinosaurs were gigantic herbivores and the predecessors of the better-known sauropods. They are also closely related—in evolutionary terms—to theropods, which include Tyrannosaurus, Velociraptor, and modern birds.
Currently in paleontology there is open debate about the evolutionary relationships and development of sauropodomorphs. This find establishes that sauropodomorph dinosaurs were more widespread than previously known, existing in the Americas, China, South Africa, and now Antarctica. Second, in conjunction with a prior Antarctic find of an early sauropod, it suggests that Glacialisaurus hammeri—the new species—coexisted with true sauropods during the late Triassic and early Jurassic.
The second bit of dino-related news comes from a find not near a pole, but from just above the equator in the African nation of Niger. The fossils of interest here are not new—not that any fossil really is—they were discovered during an expedition in 1997 by University of Chicago paleontologist Paul Sereno. These fossils were identified as belonging to a new species, Carcharodontosaurus iguidensis, by a University of Bristol graduate student. This species would have been one of the largest carnivorous dinosaurs ever to walk the earth—measuring in at 13-14 meters long, it would have been longer than a double-decker bus.
Fossils of this species have an interesting history. The first known Carcharodontosaurus fossils were found in the 1920s and consisted of two teeth, each the size of a banana. In the intervening decades, however, those relics have been lost. Other remains were found in Egypt in the 1930s but were destroyed in the bombing of Munich in 1944. A skull of the dinosaur was found about a decade ago in the Moroccan Sahara. This leaves little evidence for scientists to go on, but it does illustrate that a number of theropod species were living simultaneously in Africa around 95 million years ago. The current set of fossils consists of several pieces of the skull—parts of the snout, lower jaw, and braincase—as well as part of the neck. The findings are reported in this week's edition of the Journal of Vertebrate Paleontology.
Acta Palaeontologica Polonica, 2007. 52(4): 657-674
Journal of Vertebrate Paleontology, 2007.
The House today held a hearing on the new PRO-IP Act that beefs up intellectual property enforcement. Rick Cotton, a top NBC lawyer and representative for the Coalition Against Counterfeiting and Piracy (CACP), called counterfeiting and piracy "a global pandemic" and "a dagger into the heart of America's future economic security." What the US needs, he said, is a "declaration of war." But not even the Department of Justice is convinced that PRO-IP, in its current form, is that sort of declaration.
Counterfeit goods are certainly a problem, and no one at the hearing stood opposed to crafting good intellectual property law to protect creative work and new products (even Public Knowledge's Gigi Sohn proclaimed her support for IP law and enforcement).
Rep. Darrell Issa (R-CA), who made money in the car alarm business and was the voice of the "Viper" system, used his opening statement to tell his fellow representatives about how other companies ripped off his products, including his voice, and sold them in the US market. Defective products would arrive at Issa's company that he had not even manufactured, though in the minds of customers, his company was to blame. Will PRO-IP help to fix such problems?
Concerns from Justice
The PRO-IP Act seeks to stem the "tsunami" (as one representative put it) of counterfeiting and piracy by making a pair of changes to the structure of the federal government. First, a new executive branch office devoted to intellectual property enforcement would be created in the White House, and it would be modeled on the office of the US Trade Representative. The Department of Justice would also get a new IP enforcement division that would consolidate work currently done in several other divisions.
Sigal Mandelker, a Deputy Assistant Attorney General at the DOJ, told the subcommittee that this plan raised some concerns at Justice. For one thing, having a White House office that can direct the priorities and investigations at Justice could undermine the independence of the department, she said. In addition, the current arrangement at Justice is "actually quite effective."
Public Knowledge weighs in
Other concerns came from Gigi Sohn, president of Public Knowledge, who attacked the PRO-IP Act's increase to the statutory damages that can be levied for copyright infringement. Referencing the Jammie Thomas case in Minnesota, Sohn noted that statutory damages are already "disproportionate penalties for infringement," and called on Congress to move them in the other direction.
Despite several significant criticisms of the bill, Sohn said that she was pleased with how subcommittee chair Howard Berman (D-CA) listened to many different stakeholders and had already removed the most egregious provisions from the bill.
"Unslakable lust for more"
Google's senior copyright counsel, William Patry, wasn't at the hearing, but he had a far less charitable take on the legislation. Calling it the most "outrageously gluttonous IP bill ever introduced in the US," Patry made clear that he was appalled by the "unslakable lust for more and more rights, longer terms of protection, draconian criminal provisions, and civil damages that bear no resemblance to the damages suffered."
One might expect that coming from a Google lawyer (the blog is written in his private capacity), since the company is a voracious consumer of copyrighted work, but Patry has himself served in the Copyright Office and has written perhaps the definitive seven-volume tome on the subject of US copyright law. Instead, he says, he is "pro-IP in this most important of senses. But an excessive amount of something that is beneficial in measured doses can become fatal in overdoses, and copyright is already at fatal strength."
The PRO-IP Act, with its attempt to increase statutory damages and increase forfeiture penalties for equipment used for copyright infringement, clearly moves in a way that Patry dislikes. Fortunately, when it comes to criminal matters, Justice remains steadfastly unconcerned with prosecuting minor infringement cases, as Mandelker again made clear in response to a question.
Still, with even harsher laws on the books, there's always a chance that the penalties won't hit only those who import ripped-off car alarms, but a huge array of ordinary Americans. Where penalties are needed, they should fit the crime. Ruining someone's financial life over the equivalent of a box of CDs or DVDs hardly seems to meet that standard.
AMD held its annual Financial Analyst Day this morning. This time around, there was no glitter, no flash, and no rosy pep talks about the current (or future) dominance of AMD products in the marketplace. The various corporate executives who spoke, including Hector Ruiz (CEO), Mario Rivas (executive vice president, Computing Products Group), and Dirk Meyer (president, chief operating officer), acknowledged the difficulties AMD is currently experiencing, admitted that the company's execution had slipped badly over 2007, and pledged that 2008 would be different. AMD's word of the day was "apologize," and multiple executives expounded on the theme. Future-casting was kept to a bare minimum, and most of the information discussed is already common knowledge in technical circles.
That's not necessarily a bad thing given AMD's current financial position and the company's desire to strike a different tone with the financial industry. Admitting the truth of where it stands and the need for change paints the company as an honest one that's willing to give real information on its operations, even when that information isn't good. The best way to follow up on such statements, however, is to deliver realistic good news about what's expected in 2008. Unfortunately, that didn't really happen, and the statements and projections AMD didn't make resonate more strongly than those it did.
According to AMD, platform-based solutions will remain a major focus for the company thanks to strong demand for them from its customers. Sunnyvale also gave some hard numbers on current quad-core shipments, stating that it shipped 34,000 Barcelona cores in the third quarter, expected to ship "hundreds of thousands" by the end of the fourth quarter, and would double Q4 shipments in the first quarter of 2008. The company also clarified the exact state of Barcelona shipments at the moment. Barcelona parts are shipping, but only to specific customers, and only in situations where AMD is able to work with the company to ensure that the TLB erratum will never be encountered. Phenom shipments will continue, but major OEMs aren't expected to offer complete systems based around the part until the end of the first quarter or the beginning of the second. This implies that most manufacturers are largely passing on Phenom until its TLB erratum is completely resolved, though AMD did not draw that particular connection.
AMD projects that it will ship Phenom and Barcelona parts in a 3:1 ratio through the first quarter of the year. Around the end of Q1/beginning of Q2, the new "B3" stepping of the K10 core should be available in volume. Once this occurs, Barcelona production and availability will be ramped, and the processor will be made available in volume to Fortune 500 companies.
That's basically all AMD had to say about Barcelona and K10. While the tone of the meeting would've made aggressive rose-colored predictions unpalatable, AMD's decision to say so little about what we can expect from K10 in 2008 was surprising. The company did discuss its transition to 45nm process technology, stating that samples were set to be delivered in January, with volume ramp beginning in the second half of 2008, but that projection is short enough to raise some eyebrows; it'll be surprising if AMD can switch to 45nm that quickly. AMD is working on a 45nm, octal-core K10.5 with 6MB of L3 cache and an MCM approach (two quad-core dies per package), but the part has not been produced on actual silicon—at least not yet.
Good vibes on ATI
ATI was the only real bright spot of the day. That particular segment of AMD is downright bullish in its expectations for 2008, and intends to challenge NVIDIA in the mid-range and high-performance desktop segments while simultaneously retaking market share in the notebook segment. New midrange and budget GPUs based on the RV620 and RV635 will be available in the first quarter of next year, and the company's RV680 (dual X3870 GPUs on one PCB) should debut relatively early in the year as well.
ATI also announced two new capabilities that will come with next-generation video cards and integrated chipsets. Going forward, integrated chipsets built on the 780G platform will be able to increase overall video performance by plugging in a budget or midrange GPU that will work in concert with the already integrated GPU to boost performance. Think of the combination as a weak Crossfire solution, but one that actually makes some sense; integrated users who upgrade to even a budget GPU will see a greater performance boost than they previously would've. ATI hasn't revealed much about this technology yet, but the company claims that it will function best with lower-end cards. As performance becomes increasingly asymmetric between the integrated GPU and the discrete part, the overhead created by enabling Crossfire inevitably overwhelms the advantage of using it.
The other announcement from ATI today is that it will begin shipping DisplayPort-capable cards this year. DisplayPort is designed as an alternative to HDMI, and its specification allows for fiber optic cabling in addition to conventional twisted copper, which lets a display use a much longer signal cable before image quality begins to degrade. There are no DisplayPort-capable monitors shipping at this time, but various companies, including Dell, Samsung, IBM, and Lenovo, have said that they will adopt the standard in the future.
If not for ATI's recent resurgence, AMD's event today would've come across as downright depressing. AMD's presentations and speeches were meant to demonstrate both an acceptance of the company's current position and a determination to turn things around and put the CPU manufacturer back on track. Unfortunately, neither determination nor admission of accountability is easily converted into cash. Today's event didn't seem to be the work of a company that's expecting much good news in the fourth quarter, and the lack of information on upcoming K10 improvements cast doubt on that processor's ability to carry the company financially.
Ben's been doing all the talking on the Child's Play front as of late, but now it's my turn to say that you guys have been doing an amazing job when it comes to raising money for this great cause. The Opposable Thumbs Child's Play Drive has now passed the $5,000 mark and the donations are continuing to pour in at an awe-inspiring rate. We asked you to dig deep, and a lot of you have: there's going to be some happy kids this holiday season, that's for sure.
We've still got our big personal prize pack borne from our personal video game collections to come, so keep those donations coming in. Ben has hinted at it, but I'm willing to be more forthcoming: some of the stuff in the big pack you simply will not be able to find easily or cheaply—if at all. My out-of-pocket contribution, in all its white, pearly, non-Nintendo-related glory, has left a stinging hole where my heart used to be, but it's for a good cause that I'm willing to make just such a sacrifice. Expect the final unveiling soon.
For those who haven't been following the Child's Play Charity run this year outside of our drive here, the Penny Arcade guys have voiced extreme satisfaction with this year's turnout thus far. The total donations were up to $600,000 prior to last night's Child's Play Gala Event, which managed to raise an incredible $225,000 more. The night's auction starred some incredible impromptu acts of generosity, as Bungie sweetened its offerings with the Bungie employee-exclusive, in-game Recon Armor suit for Halo 3 and Valve dished out the desirable Portal-born Weighted Companion Cube as well as a tour of the office. "And the throng responded—fiscally," Tycho remarked.
It's been a great year for the charity, but that should come as no surprise. Tycho put it so: "That you can generate a million dollars in just a couple months—on an annual basis—is now, to me, quite ordinary. You are incredible, and changing the world is easy for you. I should have understood that from the start." Keep the donations coming, everyone. Let's see if we can't make 2007 not only a memorable year for gaming, but an unforgettable year for giving.
I'll admit it. I'm kind of a sucker for end-of-the-year top 10 lists. They don't have to be "best of" lists, either; sometimes the "worst of" lists are even more enjoyable. That's why I was mildly excited to see Popular Mechanics' The Top 10 Worst Gadgets of 2007. If you are a sensitive Mac user, the type who gets angry when people say negative things about your platform of choice, or who follows the company with an enthusiasm that can only be described as zealotry, you should just stop reading now.
The list starts out innocently enough: number 10 is a vibrating exercise machine straight out of the 1960s, number eight is everyone's favorite brown portable media player (the Zune), and number six is some sort of talking WiFi rabbit. Number four is, of course, the ubiquitous, yet-to-be-released robotic dinosaur (the Pleo), and number two is none other than the Apple TV.
That's right: the darling of so many year-end top 10 lists, Apple, has landed on a worst-of list. According to the article's blurb on the Apple TV, the high (or low, as the case may be) ranking doesn't stem from what is wrong with it, but from the fact that "there's nothing overwhelmingly right about it, either." Popular Mechanics argues that in a crowded field of similar devices, it just doesn't shine. The magazine also argues that the content just isn't there. We would add the inability to purchase content from the device and the lack of high-definition media to the reasons the Apple TV falls short.
We know that you're wondering what the number one worst gadget of 2007 is, and we're glad you asked. It's none other than the Palm Foleo. "But the Palm Foleo never shipped!" you say. You are correct. That makes the Apple TV the worst own-able gadget of 2007. Now, in Popular Mechanics' defense, it did put the iPhone on its Top 10 Most Brilliant Gadgets of 2007 list, and the magazine based its choices of worst gadgets on missed opportunities (and not overall terribleness). That may make it a little easier for some of you to swallow.
Google's $3.1 billion acquisition of DoubleClick has drawn a lot of scrutiny on both sides of the Atlantic since it was announced last April, and continues to do so as the year draws to a close. Consumer groups have expressed their concerns over the privacy implications of such a deal, since both companies have harvested massive amounts of consumer data in order to build their advertising networks. Rep. Joe Barton (R-TX) believes the companies haven't been forthcoming about how they'll address those concerns, leading him to demand answers in a letter sent to Google CEO Eric Schmidt earlier in the week.
In a copy of the letter seen by Ars, Barton reminds Schmidt that the CEO promised to help Barton and his staff better understand the company's search and targeted advertising practices. According to the letter, when Barton tried to send staff to Google's campus to discuss the matter, they were turned away. "Since then, all efforts to reach a mutually agreeable time have been rebuffed, and it begins to seem that no date for a visit is sufficiently convenient to Google," wrote Barton. "Your warm initial invitation followed by Google's chilly response to a proposed visit by Committee counsels is disconcerting." (Google disputes that Barton has been rebuffed—a spokesperson said that the chosen dates were merely inconvenient, that no alternatives were provided, and that Barton's committees could come at any time.)
Throughout the rest of the missive, Barton takes Google to task over its search practices and algorithms, data retention, data analysis, advertising programs, how the search giant anonymizes certain data, cookie use, and more. Overall, the 24-question letter asks Google to describe its practices in great detail, and requests a response as early as next week.
In the meantime, others have called into question the Federal Trade Commission's ability to investigate the merger impartially (the investigation began in May). The Electronic Privacy Information Center—one of the organizations that filed the initial complaint with the FTC—has asked (PDF) that FTC Chairman Deborah Majoras be recused from the investigation after discovering her husband's involvement with Jones Day, a law firm that has represented DoubleClick in the past. The husband, John Majoras, told the Mercury News that he was not involved in the deal between DoubleClick and Google, although the FTC is reportedly already discussing the issue with its ethics officer.
Update: Majoras and the FTC have decided that she will not recuse herself from the merger approval process.
Academic research can be a messy thing. In contrast to the carefully formatted and argued publications that result, the raw material is often a mass of annotated documents, hastily taken notes, and scattered references. The Center for History and New Media at George Mason University thinks that this raw material could be just as useful to the wider research community as the final publication. They've now secured a grant from the Andrew W. Mellon Foundation to create a system for uploading it into a database at the Internet Archive. The catch? Academics have to organize that material using Zotero, a Firefox plug-in.
Zotero is actually an impressive piece of software. It integrates into Firefox, allowing researchers to save and organize the material they access through the Web. But its library can also contain references to files stored locally, as well as free-form notes. Users can annotate any of these documents or tag them for future searches. It also formats citations for publication and integrates reference management into Word via a plug-in. In this way, it acts much like commercial reference managers such as the cross-platform EndNote or Papers on OS X.
With the new grant, the Center for History and New Media will be getting half a million dollars to enable Zotero to send the notes and other material on to the Internet Archive. The Archive will be getting another $700,000 to handle the material as it comes in and make it accessible to the academic community via a searchable database.
According to the Chronicle of Higher Education, academics will be encouraged to share in part by the convenience of the system. If someone is already using Zotero, it will just take a few extra clicks to place their material online. A carrot will come in the form of OCR software at the Internet Archive; scans of handwritten notes and original material will be sent back to researchers as editable text.
There are a couple of serious practical issues with this plan. The first is that it's limited to academic users of Firefox who have decided that Zotero is a compelling solution—in short, a small subpopulation of the research community. The second is that some of the uploaded material will undoubtedly run into the same issues regarding copyright and fair use that plague any information-sharing site.
Will the material that's uploaded be of any value? Based on my personal experience, the answer here will be mixed. I've taken notes and made annotations for everything from peer-reviewed publications to articles for Ars, but only a fraction of the ideas ever make it into the publication. Within the remainder, there are some genuine insights that don't make the cut due to a lack of direct relevance or space constraints. But there are also a lot of spur-of-the-moment thoughts that I later reject due to further reading or analysis. Unless all contributors are careful about what they upload, this effort may produce a storehouse of bad ideas.
When most people think of a high-end laptop chassis, they think of titanium, or possibly the brushed aluminum that's currently popular. If Asus' prototype attempts prove successful, however, the next high-end, must-have chassis material on a laptop could be bamboo. At first glance the idea sounds quite odd, but bamboo actually possesses a number of characteristics that could make it suitable for housing a computer.
Bamboo is naturally flexible, durable, and extremely strong; laminates can be applied to the material to shape or strengthen it further. It grows rapidly and abundantly, and could theoretically serve as an ecologically renewable resource for building laptop (and, I suppose, desktop) chassis. Asus has yet to define the specs on its Eco Book (as the product is called), but the evaluation is ongoing. At present, even if the Eco Book proves successful, it'll remain a high-end option aimed at executives who might otherwise be in the market for a leather-bound or alligator-skin notebook. With bamboo mice, keyboards, and monitor frames already on the market, it's not hard to see where a bamboo laptop might fit as well.
Bamboo-based products may be more eco-friendly and less likely to pollute the environment once a system has been disposed of, but I'd wager that even companies interested in deploying such products would want to conduct aggressive long-term durability tests first. Consumers would have to be assured that a bamboo laptop wouldn't be more likely to chip, break, or crack than a metallic chassis under similar stress. That might be a tougher sell in Western markets, where bamboo isn't widely used as a building or scaffolding material, but it's certainly possible if manufacturers can produce a chassis that plausibly holds up over the long term.