
AMD 790FX boards steadily improving post-launch


AMD's Spider platform launch last month was originally meant to demonstrate the company's decisive return to competitive status vis-à-vis both Intel and NVIDIA. As far as NVIDIA is concerned, the Spider launch can be considered a success; AMD's new HD 3850 and HD 3870 are both solid products. We've also covered the current status of Phenom and the timeframe in which we can expect solutions to the processor's documented problems. The third point of AMD's platform triangle is the 790FX chipset (and derivatives) that launched on November 19. Initial reports on the chipset weren't good; multiple reviewers documented the problems they had simply stabilizing their launch boards enough to test them.

Several weeks later, though, things appear to be changing for the better. Motherboard manufacturers appear to be taking early bug reports seriously. Gigabyte has already released three BIOS updates for its GA-MA790FX-DQ6 (the last one currently available contains the TLB L3 erratum fix) and two for the GA-MA790FX-DS5. Asus hasn't made any BIOS updates for the M3A32-MVP Deluxe or the M3A available, but we've gotten updates from AMD containing the latest BIOS versions the company is currently testing. MSI also released a new BIOS just before product launch that fixes several issues.

So far, all of the manufacturers that have launched Phenom boards appear to be committed to supporting and improving their respective platforms. That's an encouraging sign for AMD users who might be interested in a 790 board either now or at some point in the future. AMD has had a fair bit of success in the low-power market with its 690G chipset; hopefully the upcoming 790-based integrated chipset can continue this trend.

Judge: eBay can use “Buy It Now,” still owes $30 million


A US district judge has penned the latest chapter in eBay's long-standing patent infringement battle with MercExchange LLC. Since 2003, eBay has been fighting a jury's decision that its Buy It Now feature infringes on two patents held by MercExchange, originally an auction site itself over a decade ago. eBay fought off an injunction earlier this year, but the judge's ruling today orders eBay to pay $30 million to MercExchange, LLC.

This patent dispute hails all the way back to 2001, when licensing talks broke down between eBay and Tom Woolston, founder of MercExchange. Woolston promptly filed suit against eBay, fighting for an injunction to stop the service from using the feature. A US District Court sided with MercExchange in 2003, but eBay's appeals eventually escalated the case up to the Supreme Court, where the focus shifted and subsequently garnered the attention of IP lawyers in everything from the software industry to the pharmaceutical business.

At issue before the Supreme Court was whether patent infringement should necessitate injunctions to stop companies from distributing their products. This meant much higher stakes were on the table. Todd Dickinson, GE's vice president of intellectual property, even called it "the most important commercial law case before the Supreme Court so far this century."

The Supreme Court sided with eBay in 2006, ruling that patent infringement does not automatically warrant a permanent injunction. In July of this year, Judge Jerome B. Friedman noted the fact that MercExchange hasn't been operating as an auction site for quite some time, along with the company's obvious attempts to use its patents as a revenue-generating crutch. With that in mind, Friedman dismissed the possibility of an injunction, ruling that eBay could continue using its Buy It Now feature.

The greater industry win from eBay's battle with MercExchange has been the establishment of injunction boundaries in patent suits. However, the fact that the battle took so long and that eBay still owes $30 million is an ugly reminder of the state of software patents. Naturally, eBay plans to appeal the ruling.

Nokia releases Internet Tablet development manual


Nokia has officially released the Maemo 4 training manual, extensive documentation for third-party Maemo application developers. The manual, which consists of three distinct guides, is intended to serve as a starting point for programmers who want to create software for Nokia Internet Tablet devices. The documentation is all distributed under the Creative Commons Attribution-Share Alike license and the code examples are distributed under the MIT license.

"We hope that maemo training courses provide an efficient overview of the tools and methodologies needed when developing applications and platform services to the maemo platform," said the Maemo team in an announcement. "Courses have been done to be as hands-on style as possible. In practice this means that there are a lot of exercises and examples how to write simple GUI applications and platform services and how to integrate them into the maemo platform and the packaging framework."

The Getting Started guide provides a high-level overview of the system, describes how to install the Maemo SDK, and provides a concise tutorial that describes how to create a very simple Hello World application. The Application Development guide introduces GTK development concepts, explains how to use Autotools, describes various GNOME support libraries like GConf, describes how to integrate with the Maemo Application Framework, and includes a packaging tutorial. The Platform Development guide explains how to leverage components of the Maemo platform infrastructure like D-Bus and LibOSSO.
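To give a flavor of where the Getting Started tutorial ends up, here is a minimal sketch of the kind of Hello World it builds toward. This is written with PyGTK for brevity rather than the C the manual itself uses; the window title and label text are placeholders, and on an actual tablet the window would typically be wrapped in Hildon's equivalents.

    import gtk

    def main():
        # Plain GTK "Hello World"; in a real Maemo app this would be a
        # hildon.Window attached to a hildon.Program instead.
        window = gtk.Window(gtk.WINDOW_TOPLEVEL)
        window.set_title("Hello Maemo")            # placeholder title
        window.connect("destroy", gtk.main_quit)   # quit when closed
        window.add(gtk.Label("Hello, Internet Tablet!"))
        window.show_all()
        gtk.main()

    if __name__ == "__main__":
        main()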

The training manual is an excellent resource that is well-written and easy to navigate. Everyone who is interested in Maemo development should take a look. Additional documentation, like the Maemo 4 porting guide and API reference, is available at the Maemo development site.

New “white spaces” group to lobby for wireless broadband


The idea of using the white spaces between digital TV channels for wireless broadband has been somewhat lost in the build-up to the 700MHz wireless spectrum auction next month. The newly formed Wireless Innovation Alliance (WIA) hopes to change that with a PR campaign directed at the Federal Communications Commission and policy makers. Its goal is a set of "clear, reasonable regulations" that will enable wireless broadband deployments in the space between DTV channels.

In October 2006, the FCC voted to open up the white spaces to use by unlicensed consumer electronics. This past March, a group of eight companies calling itself the White Spaces Coalition submitted a prototype device to the FCC for testing. The first device malfunctioned and got a failing grade from the FCC; the Commission neglected to test another device submitted in May.

In a subsequent filing, the White Spaces Coalition argued that, since the second prototype device passed its tests with 100 percent accuracy, the FCC should direct its attention towards crafting rules to govern how, exactly, the empty digital TV spectrum should be used for wireless broadband.

There is significant opposition to using white spaces in the digital TV spectrum for wireless broadband, most of it coming from broadcasters. This past September, the National Association of Broadcasters launched a PR campaign of its own, calling the White Spaces Coalition's proposals "wrongheaded." The NAB argues that wireless broadband in the white spaces will inevitably result in interference and released a statement today critical of the WIA. "It is unfortunate that Microsoft and Google continue to try to muscle their way through Washington in support of a technology that simply does not work," said NAB executive VP Dennis Wharton. "By playing Russian Roulette with digital television, Microsoft and Google would completely undermine the historic public-private DTV partnership that broadcasters embraced to ensure America's ongoing leadership in innovation."

The Wireless Innovation Alliance hopes to counteract the NAB's efforts with an educational campaign of its own and by working closely with policy makers. The group already has backers in Congress, including Rep. Marsha Blackburn (R-TN) and Rep. Jay Inslee (D-WA). "I'm proud to support the WIA coalition, alongside my colleagues in Congress who also back the use of TV white spaces," said Rep. Inslee. "I look forward to helping the coalition's education campaign."

Members of the alliance include a handful of familiar faces from the White Spaces Coalition, including Dell, Google, HP, and Microsoft, as well as Public Knowledge, Free Press, EDUCAUSE, and the National Hispanic Media Coalition.

In addition to participating in the white spaces broadband effort, Google is also planning a bid on the 700MHz spectrum. Between the spectrum auction and the white spaces proposals, hopefully broadband in the US will become cheaper, more competitive, and universally available over the next few years.

Google Toolbar 5 intros pref syncing, Gadgets to IE users


Google Toolbar fans will be happy to hear that the company released an update to its toolbar for Internet Explorer today. Google Toolbar 5 for IE adds Gadgets support, settings sync, and address correction (among other things) to the browser tool that should make it even more convenient for users to retrieve information on their Internet travels.

One of the more notable new features in Toolbar 5 is the ability to add and use Google Gadgets directly from the toolbar. Almost anything that someone has written for Google Gadgets, which can be used as part of Google Desktop on both the Mac and Windows, can now be added as a button and used as an extension of the toolbar. I added the Google Maps gadget to mine, which functions differently from the Google Maps button that comes with Toolbar by default. Instead of merely linking to the Google Maps web page, the Maps gadget lets me search for things right from the toolbar without having to navigate to a new page.

My mini Maps results window
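For the curious, a Google Gadget is nothing more exotic than a small XML file, so almost anything of roughly this shape can now be pinned to the toolbar as a button. A minimal sketch, with placeholder title and content:

    <?xml version="1.0" encoding="UTF-8"?>
    <Module>
      <!-- ModulePrefs holds the metadata shown for the toolbar button -->
      <ModulePrefs title="Hello Gadget" />
      <!-- The payload is ordinary HTML/JavaScript -->
      <Content type="html">
        <![CDATA[
          <div>Hello from a toolbar gadget!</div>
        ]]>
      </Content>
    </Module>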

Google has also added a feature that allows the Toolbar to sync its settings with a server so they can be accessed from multiple computers. No longer do you have to maintain one set of Toolbar settings on your desktop and try to mirror it on your laptop—just set it once, sign in on both computers, and Toolbar will take care of the rest. That's certainly helpful for those of us geeks who are used to switching among multiple computers at work, at home, and on the road.

Toolbar has also gained Google Notebook support, which allows users to store links, clips, and other information from the Web in a button-accessible spot. If you're not into simply bookmarking sites or you want to store more than just a link (such as notes to go along with it), this can be handy.

Add clippings and notes to the Notebook

And for those who have clumsy fingers, Toolbar can now suggest alternatives for common navigation errors. Instead of directing you to a 404 or DNS error for typing "youtube.co," the results page will suggest alternatives in an attempt to help you get where you're going. This isn't exactly a mind-blowing new feature, but it's nice nonetheless. Finally, Toolbar's AutoFill feature has gotten a bit of a boost by allowing users to enter and manage multiple profiles. For example, if you have a company credit card with a certain billing address, you can keep that separate from your personal credit card and address info. Google says that the tool has been tweaked to be more accurate in detecting where information should be placed, too, although in my tests, AutoFill still didn't work at all (or correctly) at several sites.

The changes may not send hardcore toolbar haters to Google's servers en masse, but those who make constant use of Google Toolbar and already use IE will find it to be a welcome update. Now, if only Google Toolbar for Firefox would catch up.

When consoles and PCs collide: Unreal Tournament 3 reviewed


When men were men and mice had balls

Unreal Tournament 3
Developer: Epic
Publisher: Midway
Platform: PC, PlayStation 3 (Xbox 360 in the future)
Price: $49.99 (PC), $59.99 (PS3)
Rating: M (Mature)

Unreal Tournament 3 feels like an anachronism. Back in the glory days of competitive first-person shooters, it was all about speed, reflexes, and having the best route through each map. Get the health, get the best weapons, and be able to nail a moving target with a headshot. The days of Quake and Unreal were great for those of us with gaming ADD; I used to put on headphones and blast my favorite metal songs while blowing away my friends. I could get out all my aggression in about half an hour of playing those games.

Then something happened. Things slowed down. Vehicles were put in. People were expected to play in large teams, hang back, use their "classes." The change that games like Battlefield 1942 brought to multiplayer games was an interesting shake-up, but then everything went that way. It's been a while since we've had a game that was all reflex, that was a mad dash to your favorite weapon, where you could download a new map or gametype every day if you wanted to.

I'm too awesome to notice that Darkwalker behind me.

Unreal Tournament 3, even with its odd name (it comes after Unreal Tournament and apparently combines Unreal Tournament 2003 and Unreal Tournament 2004 as Unreal Tournament 2, while leaving the single-player games aside), doesn't really have to do much as a game for it to get a hearty "buy" recommendation. With any of the Unreal games you're basically buying the latest iteration of the Unreal engine, with a few examples of what it can do. Sure, the out-of-the-box experience usually delivers some amazing thrills, but the real value is what amateur coders and enthusiasts will do with the game using the robust editors and tools. You'll be playing free, high-quality Unreal Tournament 3 content for at least the next two years; if the game came with an empty box and an engine license, it would still be a good deal.

But we can't review based on potential, even if amazing stuff is already online. We're going to look at what the game is like now, as well as how the game is blazing its own trail on consoles. On the PC, this is an incredibly strong, almost old-school shooter. On the console, it's breaking down barriers.

Everyone moves faster, everything is slower

Ironically enough, those of us who spent way too much time playing Unreal Tournament 2004 will have a harder time getting a feel for UT3 than newcomers. The flak cannon isn't nearly as powerful or as fast. Neither is the rocket launcher. The minigun feels more powerful. Watching how easily nimble opponents can double-tap the A or D key to jump out of the way of the slower rockets while pumping you full of enforcer rounds is a humbling experience. It's still a twitch fest, but spamming rockets isn't the easy road to success.

You'll have plenty of time to figure all this out in the campaign mode, a single-player experience that takes you through the new game modes and maps while explaining what's going on. While this is a good way for rookies and veterans alike to get proficient in the game, the dogged determination to explain why a war is being fought like a first-person shooter is campy at first, then just mildly annoying. The "Field Lattice Generator" that's so important? Yeah, it's the flag. FLaG. Get it? I don't know why the game can't just calm down and be a game, but it's at least amusing in its earnestness.

Still, it's worth it to go through at least a few of the missions in the campaign mode just to figure out what's up. To make the process a little bit more tolerable, you can play with others against the PC-controlled bots in a kind of single-player co-op mode. This can be a good way to learn some tactics while not at the mercy of another human team, and it's a very welcome feature, especially considering how the teammate AI falls apart in objective-based maps.

At first, Unreal Tournament 3 might seem like just a cosmetic upgrade to an already old formula, but once you start to really dig in, you get a sense for how much things have changed and how different the playing field now is.

AMD’s Barcelona, Phenom suffer early setbacks


It has been over four months since AMD launched the quad-core Barcelona server processor that was to turn the company around, and a little over three weeks since the launch of Phenom, the desktop derivative of Barcelona. So how is the new quad-core part doing? Not so well, thanks to an odd error that has limited the clockspeeds of the new chips at precisely the moment when AMD needs all the GHz it can get to keep up with Intel.

Those of you who've been following the TLB erratum saga will know much of the following story, but if you haven't been keeping track, here's a recap to bring you up to speed.

Slow launch speeds point to design defect

Phenom came out of the gate last month with an underwhelming launch performance that left it looking like an also-ran against Intel's Kentsfield-based 2.4GHz Q6600. With Yorkfield-based quad-core parts already available, AMD badly needed to establish its quad-core part as superior to the Q6600 and failed to do so with Phenom. At 2.3GHz, Phenom still trails the Q6600, and while bumping the chip up to 2.4GHz does shrink Intel's lead, it does not eliminate it.

Shortly after launch day, it emerged that Phenom's slow launch speed was merely a symptom of a deeper problem, one that also affects Barcelona. AMD announced the existence of an error in Barcelona's and Phenom's TLB (translation lookaside buffer), and that revelation has tainted the current version of the core. Despite AMD's promise that the error is extremely rare, consumers have historically reacted poorly to hardware that they perceive as defective. In this case, the defect might never appear over the entire lifetime of a desktop chip like Phenom—but it's still enough to push away potential buyers.

Though the problem may be rare for Phenom customers, the problem is apparently more acute for Barcelona customers. Server chips generally see more intense cache usage, so Barcelona users are more likely to see TLB-related problems than Phenom users. This being the case, AMD has allegedly stopped shipping Barcelona to customers, although there's some dispute over whether or not there has really been a change in the company's shipping patterns.

Fixing the problem

AMD has created a solution to the flaw that Phenom board manufacturers are required to include, but the fix itself knocks anywhere from 5 percent to 20 percent of performance off the chip. That's not good news for a processor that's already lagging behind its closest Intel counterpart, and it will turn off potential Phenom customers even further.

AMD fixed the TLB erratum in the upcoming "B3" revision of Phenom and Barcelona. Unfortunately, that core revision isn't expected until mid-to-late Q1, and it's not clear how willing AMD will be to scale the K10 architecture before the new core arrives. Currently, AMD's roadmap projects the arrival of a 3GHz Phenom in the second quarter of 2008, but that's a very slow ramp considering that Intel's quad-core Yorkfield is already available at 3GHz.

AMD did the right thing by being upfront and honest about K10's TLB erratum, and that may help the company's sales in the longer term by generating consumer goodwill. Regardless of whether or not this occurs, however, Sunnyvale's next few quarters aren't going to be pretty. Even once the processor's TLB issue is corrected, the new quad-core line's projected scaling isn't aggressive enough to close the gap between itself and Intel's Penryn offerings.

Some good news on the desktop front

While Barcelona's future is still looking murky, Phenom has a few bright spots amidst the current clouds. The immediate good news is that chips and boards are available in the retail channel. NewEgg currently carries both the Phenom X4 9500 (2.2GHz) and the Phenom X4 9600 (2.3GHz) at prices of $239.99 and $275.99, respectively. Current Phenom parts obviously aren't going to break any sales records, but AMD is at least shipping the part.

The company is also confident that end users will never experience the TLB bug—so much so, in fact, that it intends to offer them the chance to turn the required BIOS fix on and off at will. Future versions of AMD Overdrive will include such a toggle, thus allowing enthusiasts to choose for themselves which mode to run in.

AMD also plans to announce a multiplier-unlocked "Black Edition" Phenom in the near future. The chip will run at the same 2.3GHz as the current Phenom X4 9600 and will retail for the same $275.99. There's no word on what kind of ceiling these chips might have—launch processors tended to top out at 2.8GHz or so, but AMD has stated that its current 65nm K8 production is running very well. That's possibly indicative of a general improvement across all AMD product lines, but it may also apply specifically to K8 production.

With Phenom unlikely to catch Core 2 in terms of raw performance, AMD will have to find other areas in which to compete with Intel. Strong execution from ATI, hopefully culminating in the company's return to the high-end market, would certainly help, as would further refinements of the 790 chipset. The company has also given itself some room to cut prices on Phenom and could potentially squeeze further performance or lower power consumption out of K8. There's also the introduction of tri-core and native dual-core products yet to come, both of which could potentially improve AMD's outlook.

Bali climate meeting update: limits out, mitigation aid in (updated)


The Bali negotiations, meant to lay the groundwork for a successor to the Kyoto Protocol, are nearing their end, and the picture of what will emerge has become clearer. Despite an impassioned plea by the Secretary-General of the UN and strong backing from the nations of the European Union, the US and China have succeeded in preventing any agreement on hard targets for future carbon reductions. Progress has been made, however, in addressing some of the concerns of the developing world.

Although the Bali negotiations began last week, many high-level officials are arriving only now, after the preliminary work has been completed. Ban Ki-moon, Secretary-General of the UN, spoke to the gathered delegates earlier today, calling climate change "the moral challenge of our generation." He presented inaction as a threat to future inhabitants of the planet: "Succeeding generations depend on us," he said. "We cannot rob our children of their future."

To a large extent, however, chances for any definitive action had already passed by the time he spoke. The US delegation entered these negotiations with the intention of blocking any hard limits on carbon emissions, and they carried the day over the objections of the European Union and most developing nations. This position is in keeping with the Bush administration's promotion of long-term aspirational goals, rather than strict limits.

When talking to reporters after his speech, Ban apparently accepted that setting limits at this stage "may be too ambitious," according to the Associated Press. "Practically speaking," he said, "this will have to be negotiated down the road."

With no movement on that front, attention has shifted towards a second major goal of the Bali meeting, aid to the developing world. The New York Times is reporting that more significant progress has been made on that front. Kyoto had set up a mechanism for funding adaptation work in the developing world, but the effort was chronically underfunded and developing nations have found it difficult to get projects approved. The new agreement streamlines the approval process and levies a tax on carbon trading markets to fund the program. Although this is still unlikely to cover anywhere close to the predicted costs the developing world will face due to climate change, the Times suggests that this agreement is being viewed by those nations as a sign that their concerns are at least being treated seriously.

Given the stated positions of major participants going into these talks, these results are fairly predictable. Much of the hard work involved in forging a binding agreement has been put off to the future. Perhaps the most significant consequence of the Bali negotiations is that any agreement on binding emissions limits is unlikely to be completed prior to the end of the Bush administration. Many of the presidential candidates appear far more willing to consider mandated limits on carbon emissions, raising the possibility that the negotiations that have been put off to the future will be less contentious when that future arrives.

UPDATE: The BBC is reporting that the European Union is threatening to retaliate for the US' intransigence at Bali by boycotting future climate talks hosted by the White House. These talks are intended to set the aspirational goals for emissions curbs favored by the US; the absence of major emitters would make them largely pointless. The report suggests that this threat may simply be an EU negotiating ploy, but many have interpreted the Washington talks as little more than an attempt to undermine Bali in the first place.

A first look at KDE 4.0 release candidate 2



The KDE development community has been working on KDE 4, a major KDE revision that brings in numerous new technologies, and the official release is scheduled for next month. My colleague Troy has provided in-depth overviews of previous betas, but today we're going to take a look at the second KDE 4 release candidate, which was made available for download earlier this week.

For my tests, I used the KDE 4 RC 2 Live CD image provided by the KDE community. The Live CD is based on openSUSE and provides easy access to a complete KDE 4 environment.

Although the release candidate has many rough edges and is significantly lacking in polish, the underlying technologies are all in place and deliver some very impressive functionality. KDE 4 offers some unique architectural advantages over its predecessor, including a completely redesigned desktop and panel system called Plasma, an improved version of the KWin window manager with advanced compositing features, a new modular multimedia framework called Phonon, and a sophisticated hardware API called Solid.
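Phonon is the piece of that list most easily demonstrated in a few lines: application code builds an abstract media pipeline and leaves backend selection (xine, GStreamer, and so on) to the platform. Here is a minimal playback sketch, written against the PyQt4 Phonon bindings rather than KDE's native C++ API, and assuming a local file named song.ogg:

    import sys
    from PyQt4 import QtGui
    from PyQt4.phonon import Phonon

    app = QtGui.QApplication(sys.argv)
    app.setApplicationName("phonon-sketch")   # Phonon wants an app name

    media = Phonon.MediaObject()                      # source + decoder
    audio = Phonon.AudioOutput(Phonon.MusicCategory)  # sound-device sink
    Phonon.createPath(media, audio)                   # wire them together

    media.setCurrentSource(Phonon.MediaSource("song.ogg"))  # assumed file
    media.finished.connect(app.quit)                  # exit at end of track
    media.play()

    sys.exit(app.exec_())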

KDE 4 uses a completely new visual style and icon set called Oxygen. The Oxygen widget theme uses a lot of subtle gradients, light shadows, round edges, and background highlighting. Generally, the Oxygen style manages to be attractive without becoming a distraction, but there are still some places where additional work is needed. This is most evident in a handful of preference dialogs where elements are cut off or not rendered correctly. In most of the major desktop applications, Oxygen looks nice. I was also impressed with the flexibility of Oxygen. The appearance preferences dialog gives users extensive control over the colors of specific elements. Oxygen looks great with both dark and light color schemes.

In current versions of KDE, Konqueror is both a web browser and a file manager. In KDE 4, the focus for Konqueror has shifted towards browsing, while Dolphin has become the default file manager. When I first looked at Dolphin back in April, I really liked what I saw. The version of Dolphin included in the release candidate still serves up all the same great features and adds a few more nice ones, like a new column view that is similar to the Mac OS X Finder. Dolphin is powerful, intuitive, and a pleasure to use.

I tested several other applications as well, like Amarok 2 and KWord 2, both of which are impressive despite some rough edges. KWord 2's somewhat unusual user interface is intriguing, but clearly requires more polish to be truly useful. KWord 2 pushes most of the formatting features into a vertical dock that runs along the side of the window, which seems practical for users with widescreen monitors. The Amarok 2 user interface has a lot of nice little aesthetic flourishes and is easily the most polished of the KDE 4 applications.

Currently, the most significant weaknesses in KDE 4 are in the Plasma desktop layer. The Plasma components used for the desktop and the panel are undercooked. One of the primary features of Plasma is support for desktop-embedded widgets—called plasmoids—that are similar in nature to SuperKaramba widgets, Windows Sidebar gadgets, or Mac OS X Dashboard widgets. I tested several of the plasmoids that ship with the release candidate, including a clock, battery meter, calculator, and dictionary. On two occasions, adding a plasmoid to the desktop caused the screen to go black and the computer to become unresponsive, requiring me to forcibly kill and restart Xorg in order to continue with my tests. The plasmoids themselves are also somewhat buggy.

The panel at the bottom of the screen, which is also provided by Plasma, includes a menu button, task list, and notification area. I was unable to find any way to configure the panel, which is significantly less functional than the panel in the current version of KDE.

The second KDE 4 release candidate illuminates the extent to which KDE 4 has matured since the earlier betas, but a massive infusion of debugging and polish is needed before the release next month. Heavy development on KDE 4 will obviously continue after the KDE 4.0 release, so whatever pieces are still missing are sure to be filled in eventually. Some critics point to the deficiencies of KDE 4 and argue that drastic reinvention of basic desktop components might not have been a good idea. After experiencing KDE 4 myself, I have to disagree.

Transitions are always hard, but when the dust settles, a clean break between versions and an opportunity to introduce some innovative new ideas should lead to a stronger user experience. After years of development, unnecessary cruft builds up and things tend to get disorganized. The KDE 4 transition, though it will definitely be rocky at first, gives developers the ability to cut away the cruft and reorganize code in a manner that makes the whole environment more future-proof and easier to maintain.

Evidence of the advantages might not be readily apparent to end users yet, but there are plenty of sweet improvements under the hood in KDE 4. Developer-oriented changes like the much-needed migration from Autotools to CMake, the shift from Qt 3 to Qt 4, and the move from DCOP to D-Bus, for instance, all offer very real advantages.
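To make the DCOP-to-D-Bus point concrete: DCOP was a KDE-only bus, while D-Bus is the shared freedesktop.org one, so KDE 4 services become reachable from any language with bindings. A minimal sketch using the dbus-python bindings and the standard (non-KDE) desktop notification service:

    import dbus

    # The session bus is the same bus KDE 4 applications now expose
    # their own interfaces on, post-DCOP.
    bus = dbus.SessionBus()
    proxy = bus.get_object("org.freedesktop.Notifications",
                           "/org/freedesktop/Notifications")
    notify = dbus.Interface(proxy, "org.freedesktop.Notifications")

    # Notify(app_name, replaces_id, icon, summary, body,
    #        actions, hints, expire_timeout_ms)
    notify.Notify("dbus-sketch", 0, "", "Hello from D-Bus",
                  "Any application on the bus can call any other.",
                  [], {}, 5000)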

My experience with KDE 4 revealed a set of hairy, nasty warts that need to be resolved, but it also shone some light on some impressive improvements. Although I'm skeptical that all of the problems with the release candidate can be fixed in only one month, a rough first release seems a small price to pay for the significant long-term advantages offered by the transition.

Microsoft expands XNA development platform with Live functionality


Running with the success of XNA, Microsoft's open-ended development platform for PC and Xbox 360 game development, the company has moved the suite to the next phase with the launch of XNA Game Studio 2.0. The new version allows developers to leverage the proprietary technologies of Xbox Live and Games for Windows Live in their creations. The new software is now available for download from the XNA Creators Club web site.

The headlining feature of the latest build is the addition of networking support to the XNA Framework API for both Xbox Live and Games for Windows Live. The cross-platform networking protocol enables "developers to create multiplayer games in which players on separate machines can compete with each other or play cooperatively."

Also included in the update are a variety of equally significant features, including a new high-level "Game" application model and the opening of the "Guide" class to developers. As Xbox 360 users will know, the Guide is the part of the underlying Xbox 360 operating system that allows manipulation of the device, including text entry, storage drive selection, and more. The updated API also includes new methods for detecting alternative input devices, including the Xbox 360 chatpad.

The XNA platform launched publicly in March of 2006 and was an unusual step forward for the company. The typically closed world of console development was opened slightly to independent and hobbyist developers who were interested in developing for the increasingly popular platform in a cost-effective and relatively simple way, and cross-platform development was made easier for companies looking to release Xbox 360 and PC software as economically as possible.

The platform itself leverages a variety of other Microsoft technologies, including Visual Studio and, by extension, C#. In fact, the only real downside (from an end-user perspective, anyway) is the $99 annual fee required for access to the developer network: the one stipulation that prevents the service from being truly open to the masses.

Nevertheless, the platform has been enjoying success in both real-world and academic applications, and this latest advancement will only serve to further the proliferation of Microsoft's development platform.