One of the incredibly useful things about Mac OS X in general is the potential for integration between applications. Type the name of an Address Book contact in Gus Mueller's VoodooPad and it'll link the name, offering a useful contextual menu. Collect icons in CandyBar? Right-click one and you can set it as your iChat avatar, with the option of applying any of Leopard's new image effects. And let's not even get started on the power that AppleScript and its mortal-friendly Automator offer for moving and manipulating data between applications.
It is with this integration in mind that some new features in a couple of Mac OS X e-mail clients deserve a highlight, as they're fairly game-changing developments for those who work with mail on a regular basis. First is the discovery of Leopard Mail's support for message URLs, explored in depth by John Gruber at Daring Fireball. Though the new feature is strangely undocumented by Apple, users have discovered that Mail now supports a system of URLs (yes, URLs can do more than point to porn) that allows you to link to specific messages from other applications. For example, you could include links to a couple of Mail messages from coworkers alongside notes, pictures, and web links in OmniOutliner or Yojimbo documents. This opens up a whole new productivity world, allowing you to bring your e-mail into applications that aren't specifically designed to integrate with Leopard's Mail.
To make it easy for users to harvest these message links (as of 10.5.1, Mail doesn't provide an option of its own, and not all applications create the proper Mail message URL from a simple drag and drop yet), Gruber includes the code for a simple AppleScript at the end of his post. Save that script with Script Editor (found in /Applications/AppleScript/) and call it via any number of methods, such as Mac OS X's own AppleScript menu bar item, Red Sweater's FastScripts, or launchers like Quicksilver and LaunchBar. The newest Leopard version of indev software's MailTags plug-in for Mail also provides a dedicated menu option for copying a message URL.
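For the curious, the URLs themselves are straightforward: Mail identifies a message by its RFC 2822 Message-ID header, wrapped in angle brackets and percent-encoded. Here's a hedged Python sketch of what such a link looks like; the exact set of characters Mail escapes is an assumption on my part, and Gruber's AppleScript remains the authoritative way to generate these.

```python
import urllib.parse

def mail_message_url(message_id: str) -> str:
    # Leopard Mail appears to link a message by its Message-ID header,
    # wrapped in angle brackets and percent-encoded ('<' -> '%3C').
    wrapped = "<{}>".format(message_id.strip("<>"))
    return "message://" + urllib.parse.quote(wrapped, safe="@")

print(mail_message_url("ABC123@example.com"))
# message://%3CABC123@example.com%3E
```

Paste a URL of this shape into Yojimbo or OmniOutliner and clicking it should ask Mail to open the original message.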
If this integration has your productivity gears turning, but Gmail is your client of choice, Mailplane could offer a nice compromise. As a desktop Gmail browser that allows for things like drag-and-drop file attachment and even an iPhoto plug-in for e-mailing photos, Mailplane is more or less a bridge between the convenience of webmail and the integrated power of desktop clients.
New in the most recent private betas of Mailplane (1.55.4 and above) is a similar URL system for Gmail messages, which appears to work on both Leopard and Tiger. Complete with an Edit > Copy as Mailplane URL command, the feature allows users to paste custom mailplane:// URLs into other applications to bring mail out of Gmail and into their productivity workflows. Remember, though, that Mailplane is still a browser for Gmail, albeit with the aforementioned modifications and other useful touches like Growl notifications and support for multiple accounts (including Google Apps). Since it isn't an offline mail client, you'll still need to be online for a Mailplane URL to connect to its corresponding Gmail message.
Still, these new message URL features in two useful Mac e-mail clients will likely see some official integration love from other third-party apps in the near future. Aside from DIY AppleScripts, apps like Yojimbo and TextMate can only benefit from being able to include e-mail in the productivity mix. Knock knock third parties—how's about it?
Google's Chinese business has been consistently questioned and criticized, but now a Chinese company has taken issue simply with its name.
Beijing Guge Sci-Tech is suing Guge, or Google China, claiming that the Internet search giant is trampling on its good (and, perhaps more important, registered) Mandarin business name. Guge Sci-Tech registered its name in 2006, a few months before Google did. Now the tech company wants Google to change the name of its Chinese branch and pay an undisclosed sum to cover all its legal fees.
Beijing Guge Sci-Tech registered its name at the Beijing Municipal Industrial and Commercial Bureau on April 19, 2006, and Google followed with registering "Guge" on November 24 that same year. This similarity in names, Beijing Guge Sci-Tech argues, has confused the public and damaged its business.
However, if Google was already considering the name before Beijing Guge Sci-Tech registered it, that could work in Google's favor, since it would show the company chose its name in good faith. And, for all we know, the Beijing-based Guge may be little more than a trademark registration or a cybersquat, as information on the company is extremely hard to come by. Google China suggests that Guge Sci-Tech is indeed looking for an easy payout, perhaps having picked up on Google's plans by paying attention to Western media. Everyone knew Google would be changing its name in China, as "Goo-Gol" means "old" or "lame dog."
Also at issue between the companies is the definition of guge, which is not an ordinary Chinese word. Google says it's a combination of Chinese characters that mean "valley" and "song"—a reference to Google's Silicon Valley ties. Beijing Guge Sci-Tech disagrees, stating the word means "a cuckoo singing in the spring, or the sound of grain singing during the harvest autumn time."
At FOSSCamp in October, skilled eye-candy expert Mirco Müller (also known as MacSlow) hosted a session about using OpenGL in GTK to bring richer user interfaces to desktop Linux applications. Building on the technologies that he presented at FOSSCamp, Müller recently published a blog entry that demonstrates his latest impressive experiments with OpenGL, GTK, and offscreen rendering.
Müller is currently developing the GDM Face Browser, a new login manager for Ubuntu that will include enhanced visual effects and smoothly animated transitions. To implement the face browser, he will need to be able to seamlessly combine OpenGL and conventional GTK widgets. Existing canvas libraries like Pigment and Clutter are certainly robust options for OpenGL development, but they do not offer support for the kind of integration that he envisions.
The solution, says Müller, is to use offscreen rendering and the texture_from_pixmap OpenGL extension. In his experimental demo programs, he loads GTK widgets from Glade user interface files, renders them into offscreen pixmaps, and then uses texture_from_pixmap to display the rendered widgets in a GTK/GLExt drawing area, where they can be transformed and manipulated with OpenGL. Müller has created several demo videos showing how the technique can be used to apply animated transitions and add reflections to widgets. The visual effects implemented by Müller with GTK and OpenGL do not require the presence of a compositing window manager.
We talked to Müller to get some additional insight into the challenges and benefits of incorporating OpenGL into desktop applications. One of the major deficiencies of the current approach, he says, is that users will not be able to interact with GTK widgets while they are being animated with OpenGL—a limitation that stems from the toolkit's lack of support for input redirection.
"Interaction will only be possible at the final/original position of the widget," Müller told us, "since gtk+ has no knowledge of the animation/transformation taking place. I consider it to be much work to get clean input-redirection working in gtk+. There might be some ways to achieve it using hacks or work-arounds, but that should be avoided."
Eye candy or UI improvement?
Although some critics might be inclined to deride Müller's work prematurely as indulgent eye candy, he primarily envisions developers adopting OpenGL integration to tangibly improve the user experience and usability. "I would like to see [OpenGL] being used in applications for transition effects," he says. "We can right now improve the visual cues for users. By that I mean the UI could better inform them what's going on. Widgets [shouldn't] just pop up or vanish in an instant, but gradually slide or fade in. These transitions don't have to take a lot of time. As a rule of thumb half a second would be sufficient."
In particular, Müller would like to experiment with adding animated transition effects to the GTK notebook and expander widgets. He also has some creative ideas for applying animations to the widget focus rectangle in order to make its movement more visible to the user. Müller also discusses some applications that would benefit from OpenGL-based transitions. In programs like the Totem video player, he says, the playback controls in fullscreen mode could slide in and out rather than just appearing and disappearing. Alluding to the webcam demo that he made for FOSSCamp, he also points out the potential for incorporating OpenGL effects into video chat programs like Ekiga. Müller has also long advocated using visual effects to create richer and more intuitive user interfaces for file and photo management software—ideas that he has put into practice with his brilliant LowFat image viewer.
"The kind of effects possible if you can render everything into a texture, map it to a rectangle or mesh and then do further manipulations with GL (or shaders) are next to limitless," says Müller. "Just look at what Compiz allows you to do to windows now. Imagine that on the widget-level."
We also asked Müller to explain the performance implications of using OpenGL in GTK applications. "The memory-overhead is directly linked to the size of the window, since all the rendering has to happen in an OpenGL-context filling the whole window. The bigger the window, the bigger the needed amount of video memory," Müller explains. "The load caused on the system is directly linked to the animation-refresh one chooses. 60Hz would be super smooth and very slick. But that's a bit of an overkill in most cases. One still gets good results from only 20Hz."
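Müller's numbers are easy to sanity-check. Combining his half-second rule of thumb with his suggested refresh rates, a transition only needs a handful of frames; here's a quick illustrative sketch (my own arithmetic, not code from his demos):

```python
def fade_alphas(duration_s=0.5, refresh_hz=20):
    # Per-frame opacity values for a linear fade-in: at 20Hz a
    # half-second transition is just 10 frames; at 60Hz, 30 frames.
    frames = int(duration_s * refresh_hz)
    return [round((i + 1) / frames, 3) for i in range(frames)]

print(len(fade_alphas(0.5, 20)), len(fade_alphas(0.5, 60)))
# 10 30
```

Ten redraws per transition is exactly why Müller can call 20Hz "good results" — the animation is over before the extra GPU load matters much.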
There are still some performance issues with the texture_from_pixmap extension that are actively being resolved. "Due to some issues in Xorg (or rather DRI) there are still too many copy-operations going on behind the scenes for GLX_EXT_texture_from_pixmap," says Müller. "There are also locking issues and a couple of other things. At the moment I cannot name them all with exact details. But more importantly is the fact that there's currently work being done—in the form of DRI2—by Kristian Hoegsberg (Red Hat) to fix these very issues on Xorg. I cannot stress enough how important this DRI2 work is!"
Although using OpenGL incurs additional memory consumption and system load, Müller says that the impact is small enough to make his current approach viable for projects like the Ubuntu Face Browser.
Although individual developers can use Müller's boilerplate code to incorporate OpenGL into their own GTK programs, Müller suspects that support for this technique will not be included directly in GTK any time in the near future, which will make it harder for developers to adopt. "Right now it is all happening in application-level code and not inside gtk+," Müller explains. "Since I use GLX_EXT_texture_from_pixmap to achieve efficient texture-mapping out of the widgets' pixmap it is currently an X11-only solution. Therefore I imagine they might want to see a more platform-independent solution to this first." The GTK offscreen rendering feature that Müller uses also currently lacks cross-platform support.
Despite the challenges and limitations, Müller's creative work opens the door for some profoundly intriguing user interface enhancements in GTK and GNOME applications. "There are more things possible than you think," says Müller in some concluding remarks for our readers. "Don't have doubts, embrace what's available. X11 and gtk+ are pretty flexible. People who just don't seem motivated to explore ideas and test their limits (or the limits of the framework), should remember that this is Open Source. It lives and thrives only when people step up and get involved. Just f*****g do it!"
As if to beg on its knees "dear Jebus, please make Mac users care about DivX!" the company is once again offering its DivX Pro software for free in exchange for your e-mail address. For those who don't know (and we honestly don't blame you), DivX is an alternative video codec primarily used in pirated video files and some DVD players from the likes of JVC, LG, Samsung, and Sony. Now, before you start slamming my slam, do note: I said primarily, not only. That said, historians have so far been unable to decipher why a portion of computer users felt that we needed Yet Another Video Codec™, though current theories suggest that some people simply have fun endlessly installing pieces of extra software.
But seriously, we know that there are some of you who genuinely enjoy DivX. In addition to the free DivX standalone player and the DivX Community Codec, which allows QuickTime (and any QuickTime-based applications like Front Row and, I think, even iTunes) to play DivX files, the DivX Pro package includes two key components that normally cost $19.99. The first is a DivX Converter application that can handle batch conversion to the DivX format, with a clever feature that generates HTML code for easy embedding on the web. The second is the DivX Pro Codec itself, for creating DivX files from QuickTime-compatible apps like iMovie, Final Cut Pro, Adobe Premiere, and After Effects.
Ultimately, giving up your e-mail address (which remains private; I jumped on this deal last time) for $20 worth of software is a good trade, but it's only offered for a limited time (with no actual word on when that limit is up). If you need to create DivX files for one reason or another, ye best be actin' sooner rather than later, laddy.
In the never-ending war between security researchers and malware authors, each side continually attempts to outmaneuver or out-engineer the other. The latest security threat to hit the white hat radar involves a new form of system-level DNS hijacking. DNS hijacking, in and of itself, is nothing new, but it's now apparently possible to reliably initiate such attacks using web-based malware, rather than relying on an end-user to download or activate a suspicious attachment.
According to a recent report by PCWorld, research teams working out of Google and the Georgia Institute of Technology have discovered a series of open-recursive DNS servers that were classified as behaving "suspiciously." Open-recursive DNS servers are DNS servers that will answer any lookup request, no matter where it originates. So long as the DNS servers return accurate information—and the vast, vast majority do—everything is kosher. When open DNS servers don't return valid information, however, they open the door to an entire world of problems.
Poisoning a DNS server allows the malware author to send your computer virtually anywhere he wants. Since your system is being driven to false web sites based on DNS information, there's no way for any malware suite running locally to detect or report on the problem—at least, not once the damage has been done. There are still limitations on what can be done; a false web site set up to look like PNC Bank (for example) wouldn't be able to authenticate with the SSL certificate stored on a user's system. Password and logon information could still be gathered in other ways, however, and some users would undoubtedly ignore the warning signs, trusting the web address telling them they really were at www.securesite.com.
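The detection side of this is conceptually simple, even if the engineering isn't. A resolver that disagrees completely with a trusted one about where a site lives is a red flag; the sketch below is an illustrative heuristic of my own devising, not the methodology used by the Google and Georgia Tech researchers.

```python
def answers_look_hijacked(open_resolver_ips, trusted_resolver_ips):
    # Compare the A records an open-recursive resolver hands back
    # against those from a resolver you trust. Zero overlap between
    # the two answer sets is suspicious (legitimate sites do rotate
    # among multiple IPs, so only a total mismatch is flagged here).
    return not (set(open_resolver_ips) & set(trusted_resolver_ips))

# A resolver pointing "your bank" somewhere your ISP's resolver
# never mentions deserves a closer look:
print(answers_look_hijacked(["198.51.100.9"], ["203.0.113.5", "203.0.113.6"]))
# True
```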
Web 2.0 can act as something of an enabler in this process. Webpage mashups may be a hot marketing term, but pulling content from multiple web sites simultaneously is also one means of infecting the people that visit a site without them knowing what vector the attack initiated from. Fortunately, there are already some solutions to this particular problem.
Vista's UAC would actually defend a system from this type of attack by notifying the user that a program was attempting to change the system's DNS settings. I'm not sure whether current anti-malware software from various vendors would detect and prevent DNS-level hijacking, but again, such protection and notification could be implemented at the software level. The availability of user-level protection is by no means a complete solution to the problem; software companies cannot assume that all users run the appropriate anti-malware software or install the appropriate patches, but it is a place to start.
DisplayLink announced today that it has partnered with Alereon to deliver wireless USB displays. DisplayLink specializes in using USB2 as an efficient means for hooking multiple displays to the same computer without requiring an investment in multiple video cards. Releasing a wireless USB monitor is quite a step for DisplayLink—the company's wired USB2 display technology was released less than a year ago—and it may prove more popular than the company's current system, which requires either a DisplayLink-enabled monitor (at a price premium) or an RGB-USB2/DVI-USB2 adapter that apparently sells for around $90.
According to DisplayLink, there will be no performance penalty for using a wireless monitor, and displays will function at resolutions of up to 1680×1050 at 16.7 million colors. The actual wireless USB chip is produced by Alereon—that's where the partnership comes in—but both companies seem confident that they can deliver a high-quality visual experience without noticeable tearing or artifacts. DisplayLink's products aren't suitable for gaming or other functions that depend on split-second reaction times, but the company claims that DVD playback is flawless, even over its new wireless USB2 technology.
By all accounts, DisplayLink's current wired USB2 technology works quite well at resolutions of up to 1600×1200, and a wireless solution could definitely increase the concept's attractiveness, particularly in system deployments where space is at a premium. DisplayLink hasn't stated how, exactly, monitors will interface with its wireless system. Ideally, the company could offer a small transmitter/receiver unit that plugs into a monitor's DVI cable, with a similar device hooked into the back of the computer. If that's not possible, end users will have to pony up for a display that specifically incorporates the DisplayLink/Alereon wireless technology at a price premium.
The Government Accountability Office has just released a report to Congress (PDF) slamming the efforts of the FCC and the NTIA to alert consumers about the upcoming transition to digital television. The report finds that, despite plenty of work done by various agencies and private organizations, "no comprehensive plan exists" to manage the entire transition.
If it's difficult to explain the February 2009 digital TV transition to consumers, it's doubly difficult when hundreds of different groups are involved. The GAO points out that over 160 "business, trade, grassroots, and other organizations" are involved with the Digital Television Transition Coalition. The group, which is trying to get the word out so that angry seniors don't beat down their doors, also needs to coordinate with the two government agencies involved with the transition.
And that level of coordination demands a Plan, a Plan complete with milestones, key goals, and risk mitigation scenarios. In other words, a long and boring document. But as FCC Chairman Kevin Martin admitted in an interview with GAO auditors, the FCC has no such formal plan. Instead, said Martin, "the various orders contained in the FCC dockets amount to a plan."
Yes, I am. But is the government?
The GAO disagrees. Without an overarching plan, it worries that government and private industry will run into coordination problems. "Complicating matters is uncertainty regarding retailer participation and readiness and potential challenges related to inventory planning," says the report. "With limited or delayed retailer participation, consumers might face difficulties in redeeming their coupons for eligible converter boxes during the designated time period." And if the converter box program turns into a debacle, things could get ugly.
Now, the FCC and the NTIA have both done plenty of work; the FCC has even launched an ugly but functional web site (complete with Netscape favicon) for consumers. Private industry, too, has agreed to spend millions promoting the transition, with the Consumer Electronics Association, the National Association of Broadcasters, and the National Cable & Telecommunications Association all chipping in.
But the GAO wants coordination and milestones, and its report issues guidance for how to get such a Plan together. The official responses to this advice have been interesting. Kevin Martin sent a letter to the GAO complaining about the report's approach and conclusions, but didn't bother to indicate why. Instead, he spent most of the letter complaining that the GAO would not include a lengthy FCC document in the report (the GAO has put it up online).
The Department of Commerce, which runs the NTIA, acknowledged that simply relying on voluntary industry participation for such a crucial campaign had certain risks, but Commerce was not at all convinced that establishing a "digital transition czar" was the right answer. The GAO drily notes that "we did not recommend establishing a digital transition czar" and that Commerce offered no comment on the actual recommendations made.
So, with the government agencies apparently unwilling to address the actual recommendations made by the GAO, it doesn't look like a full-fledged Plan will be forthcoming. Let's hope the transition goes smoothly without one. If not, you stock up on the pitchforks, I'll collect the torches, and we'll meet at the FCC on a cold day in 2009 when the glow from our TVs turns to static. (Note: I have an ATSC-ready TV, so I won't actually be there. But I'll be thinking of you! Stay warm!)
Though the debates around .Mac's usefulness, reliability, and shortcomings show no sign of cooling, analyst firm NPD Group says that the service is now Apple's second-best seller in retail stores (it has even ranked with AppleCare at number 6 in the online store). Considering that the boxed version of .Mac contains not much more than a license code on a slip of paper, this Software-as-a-Service (SaaS) development is impressing the tech web.
As Computerworld points out, NPD Group says software sales (both on and offline) are, surprisingly, up by 10 percent this year, despite a decided lack of shelf space at major retailers like Best Buy and Circuit City. Packages like security software, Adobe CS3, and Office 2007 (which, by itself, apparently accounts for one out of every six dollars spent on software this year) are all to thank for the upswing in sales. A forthcoming surge in SaaS offerings could lead to even more software sales success—even if the physical boxes are full of air. AdventNet, for example, is gearing up to start selling $50 boxes of one-year subscriptions to its Zoho online office suite, which ironically beat competitor Google Docs to working offline with Google's own Google Gears Firefox plug-in.
Apple is, in part, credited with the smarts to "go retro" some time ago with its efforts to sell SaaS in a physical box when software was going online and downloadable. Offering .Mac in a box gives customers something physical to wrap both their hands and minds around, and it also lends itself well to things like being gifted by a friend or family member.
Still, .Mac's number 2 retail sales spot is pretty significant, and it might lead to more support and feature updates. As a happy .Mac member through thick and thin for the past four years, I would certainly like to see Apple do more with the service now that its benefits finally seem to be catching on with general customers.
This news will come as a shock to none, but the volume of spam has continued to rise throughout 2007. So much so, in fact, that spam researchers say that electronic junk mail has long surpassed the volume of human-issued e-mail this year, despite efforts to thwart it. One company, Barracuda Networks, goes so far as to say that spam now accounts for 90 to 95 percent of all e-mail, with no end in sight.
The numbers come as part of the e-mail security company's annual spam report, in which it analyzed over one billion messages sent to its 50,000 customers. Barracuda says that the percentage of spam increased from 85-90 percent in 2006, and is way up from 5 percent back in 2001. After conducting a poll of 261 business professionals, Barracuda also found that over half—57 percent—consider spam to be the "worst form of junk advertising," almost double that of junk snail mail. Only 12 percent cited telemarketers as the worst.
95 percent is awfully high (and as far as I can tell, accurately describes the ratio of e-mail that hits the server for me), but not everyone agrees on those numbers. Symantec has observed the overall spam volume increase from an average of 56 percent of all e-mail traffic in 2006 to about 71 percent in 2007, Symantec spokesman David Forstrom told Ars.
Data source: Barracuda Networks
It's hard to say which company's numbers are more accurate—"Different monitors can legitimately get different results," University of Calgary computer science professor John Aycock told us. What's important are the overall trends. One thing everyone agrees on is that spam continues to morph in an attempt to get through filters. Both Symantec and Barracuda say they have observed an increased use of file attachments in 2007, like PDFs and images, and security software vendor MXSweep says that spammers are also focusing on sending MP3 and Excel spam.
Back in April, IDC predicted that spam would overtake human-issued e-mails in 2007, but this is one prophecy that we would have preferred didn't come true. The trend shows that the 2003 CAN-SPAM Act has done little to thwart spammers from upping the ante, despite suggestions to the contrary. A few charges may have been brought against spammers here and there, but the US government can only do so much when so many spammers are located elsewhere in the world and those in the US are so difficult to prosecute.
While most of the press reporting on games focuses on the more sensational and negative aspects of the hobby, the NPD Group has just released Expanding the Games Market, a report that shows that an increasing number of people see gaming as a viable and fun hobby. The report also notes that while gaming might be a more isolated activity for the "hardcore" market, most gamers use their hobby as a way to connect with their friends and family.
The sample included 5,039 members of the NPD Group's online consumer panel, and the majority of respondents said that games were a good way to alleviate stress and unwind. Impressive for the gaming industry was that the stress-relieving aspect of the hobby was seen mostly in the older gamers among respondents, who ranged from 15 to 65 years of age.
"The new type of game experiences brought to the market over the past several years are succeeding in reaching a broader audience. The challenge for the industry is that consumers are a fickle group, and with the great variety of options pulling at their limited free time, they're going to be easily distracted unless something really compels them to stay with gaming," said NPD analyst Anita Frazier in a statement. "To reach these less-involved consumers, the industry has to work even harder, but doing so can produce great rewards."
Nintendo is one company that has clearly realized this, as evidenced by the disruptive influence of the Wii and DS, two systems that went after a wider, more casual audience to great financial reward.
NPD's study stands in contrast to two other reports released recently. Last week's Hill & Knowlton report buried pages of positive data about gaming in order to focus on a headline about regulation. Also last week, the National Institute on Media and the Family's video game report card berated retailers for allowing children to purchase M-rated content while ignoring data showing that compliance with R-rated and unrated movie guidelines was significantly lower.
With the data showing that 63 percent of the US population is playing games—and 30 percent are playing more this year than last year—gaming is on the move. This acceptance of gaming, along with the expansion of demographics that are spending time and money on gaming, has made some in the retail world take note, and it's expected that more stores in the future will begin allocating more shelf space to games and related hardware.
Gaming is already a mass-market hobby, with people of all ages picking up controllers and portables. Unfortunately, the mainstream media does not yet seem fully aware of how the mainstream itself thinks about gaming. Once it catches up, many of the industry's current PR problems are likely to be alleviated.
Microsoft's PlaysForSure has always been a model of how to run a DRM ecosystem: launch a new scheme with logo, convince device makers to sign up, launch your own online music store that uses said ecosystem, drop your music store, launch your own device which uses incompatible DRM, launch new music store with same incompatible DRM, then change branding of ecosystem logo. On second thought, perhaps there's room for improvement here.
Microsoft has just announced a change in the PlaysForSure branding that adds even more confusion to the DRM ecosystem. Instead of looking for the triangular PlaysForSure logo, consumers are now supposed to look for the "Certified for Windows Vista" logo that is used for plenty of other devices.
The obvious problem here is that PlaysForSure has nothing to do with Vista, and has in fact been used for years on XP. While never gaining much traction among music download stores, it has become the DRM of choice for subscription plans. Now, users of those plans who still run XP should look for the "Certified for Windows Vista" logo the next time they purchase a new player. That shouldn't confuse anyone.
PlaysForSure was the answer to Apple's FairPlay, and Microsoft hoped that it would become the de facto standard among music stores that were not iTunes and devices that were not iPods. It largely has, but that doesn't mean it's popular or widely used. Music stores, especially, aren't thrilled about it since it remains incompatible with the iPod, but for years they had little choice. Now, with the resurgence of MP3, they finally have some chance at reaching out to the iPod market.
Microsoft didn't help PlaysForSure by forking it with the Zune (which is Certified for Windows Vista in an entirely different way), producing an almost identical but still incompatible version of the scheme for use with its new music player. This was widely seen as yanking the rug out from under its industry partners who had supported the scheme for so long, but Microsoft continues to support PlaysForSure and companies continue to use it. Nokia, for instance, recently announced that its Comes With Music initiative would use PlaysForSure DRM.
By changing the branding to "Certified for Windows Vista," Microsoft is acting on a noble impulse: make things simpler. Devices will have one logo that shows they work with Vista, and consumers won't need to look for a separate DRM logo. Devices should Just Work. But, as we pointed out above, Microsoft's DRM works on more than Vista, and this seems like a change that might have been better made in another year, when Vista uptake rates are higher. On the other hand, since PlaysForSure hasn't meant much to most consumers, there may be little loss to just making the change now.
Okay, so the original Pentium isn't really coming back in 2008, but what is coming looks enough like the Pentium to give some of us Pentium-era CPU buffs déjà vu. I'm talking, of course, about Intel's forthcoming attempts at an embedded processor core that will compete with the likes of ARM and MIPS, a "core" that may actually be a family of cores aimed at different applications. One application is low-cost, low-power mobile devices, and it turns out that the Diamondville/Silverthorne processor that will power these devices is a lot leaner than I thought. And contrary to expectations, it also will lag Core Solo significantly in clock-for-clock performance.
The new details on Diamondville/Silverthorne are buried in the program for the 2008 International Solid State Circuits Conference (ISSCC). As always, the program contains a number of tantalizing nuggets of information that will be further fleshed out in the conference presentations, some of which were dug out by David Kanter over at RWT in a recent article. But I want to zoom in on the following entry:
13.1 A Sub-1W to 2W Low-Power IA Processor for Mobile Internet Devices and Ultra-Mobile PCs in 45nm High-k Metal-Gate CMOS
Intel, Austin, TX
A 47M transistor, 25mm², sub-2W IA processor designed for mobile internet devices is presented. It features a 2-issue, in-order pipeline with 32KB iL1 and 24KB dL1 caches, integer and floating point execution units, x86 front end, a 512KB L2 cache and a 533MT/s front-side bus. The design is manufactured in 9M 45nm high-k metal-gate CMOS and housed in a 441-ball μFCBGA package.
Unless the "two-issue, in-order pipeline" in question is some kind of derivative of the original Pentium, my previous skepticism about Intel's "new architecture from the ground up" claims was unwarranted. Even if it isn't really a Pentium derivative, and it probably isn't in any meaningful sense, the fact that it issues two instructions per clock and has an in-order pipeline makes it the spiritual heir to the original Pentium, which was Intel's original x86 two-issue, in-order design.
This also makes it a direct competitor to ARM's embedded processors, which are also two-issue, in-order designs that focus on low power. But in this particular low-transistor-count, sub-2W competition, the x86 ISA does put Intel at a real disadvantage.
Because a huge chunk of Silverthorne's die area is cache, the percentage of total die area that the new processor spends on x86 instruction decoding won't be as high as the original Pentium's 40 percent, but may well be in the double digits. As with the original Pentium, the relatively large number of transistors spent on x86 decode hardware will put Silverthorne at a performance and performance/watt disadvantage compared to a leaner RISC design like ARM, which can spend more die area on performance-enhancing cache. And Silverthorne's lower-cost Diamondville derivative, which will probably be underclocked and/or have less cache than its big brother, will look even worse next to a comparable ARM part.
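To get a feel for that cache-versus-logic split, here's a rough sketch of how much of the 47M-transistor budget from the ISSCC abstract the caches alone might account for. It assumes conventional 6-transistor SRAM cells and ignores tag arrays, ECC, and other overhead, so treat the numbers as illustrative, not measured:

```python
# Back-of-envelope estimate of Silverthorne's cache transistor budget.
# Cache sizes come from the ISSCC abstract; the 6T-per-bit figure is a
# standard SRAM assumption, and tags/ECC/overhead are ignored.

KB = 1024
cache_bytes = (32 + 24 + 512) * KB          # iL1 + dL1 + L2
cache_transistors = cache_bytes * 8 * 6     # 8 bits/byte, 6 transistors/bit

total_transistors = 47_000_000
logic_transistors = total_transistors - cache_transistors

print(f"cache: {cache_transistors / 1e6:.1f}M transistors "
      f"({cache_transistors / total_transistors:.0%} of the budget)")
print(f"logic: {logic_transistors / 1e6:.1f}M transistors")
# cache: 27.9M transistors (59% of the budget)
# logic: 19.1M transistors
```

With well over half the transistors going to SRAM, even a modest few million transistors of x86 decode hardware lands in the double digits as a percentage of the remaining logic, which is the disadvantage described above.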
Given these factors, I think we can safely assume at this point that Silverthorne will be clock-for-clock slower and less efficient than a comparable ARM part, especially on integer-intensive Web and productivity apps. But if the 45nm Silverthorne launches in the 1-2 GHz clockspeed range that Intel claims for it, then it may still be competitive in terms of performance/watt with ARM's 65nm, slower-clocked Cortex A8 (600MHz to 1.1GHz), its closest competitor. In other words, Intel is planning to do to RISC in the embedded space what it has done to RISC everywhere else: steamroll the competition with sheer process muscle.
The revelation that Silverthorne is in-order and two-issue also raises questions about its performance relative to Intel's regular laptop processors. Obviously, Silverthorne won't be anywhere near even a first-generation Pentium M in terms of clock-for-clock performance, though a (more appropriate) performance/watt comparison may put it within striking distance of even a Core Solo. Nonetheless, don't expect Windows to be anything but slow on Silverthorne—you're going to want to run a mobile Linux flavor on a Silverthorne-based MID.
Two weeks ago, Panic released CandyBar 3, a major update to its system icon customization app. The new version retired Pixadex, Panic's "iPhoto for icon junkies," and wrapped its organization features into CandyBar to make version 3 a one-stop shop for organizing and applying custom icons and Docks. CandyBar 3 is a pleasure to use for Mac-slinging UI customize-aholics, especially compared to the previous split team of CandyBar + Pixadex, but like any major rewrite, it had some quirks. Fortunately, Panic's been in a bug-squashin' mood, and CandyBar 3.1 is now among us.
New in this free update is a "Dock Preview" in the icon collection view which—you guessed it—allows you to actually see a custom Dock before you apply it. Why this wasn't included in the initial release is a tad puzzling, but there's no time to ponder that now; other new features include additional customizable System icons, a bona fide Cancel button to clear any changes you don't want to make, and proper migration from Pixadex libraries (another feature that strangely didn't make the 3.0 release, especially considering Pixadex got retired and all). Nearly two dozen other changes and fixes also made it into 3.1's release notes, including some nice polish like a contextual menu to open an icon author's website.
The new version is available via CandyBar's built-in update system, and a Leopard-only demo is available at Panic's CandyBar site. Now that we're revisiting CandyBar after the original announcement, though, we have to agree with some of the complaints about upgrade pricing from the original discussion. While CandyBar's full price of $29 is pretty reasonable considering everything it does, the upgrade prices of $24 (for owners of either CandyBar 2 or Pixadex 2) and $19 (for owners of both previous versions) are a bit steep. That said, CandyBar 3 is still a great release that rounds up a lot of functionality in a slick, well-organized UI. For the icon-obsessed, the new CandyBar really is the only way to go.