Monthly Archives: August 2019

When consoles and PCs collide: Unreal Tournament 3 reviewed

Posted on 08/08/2019 by

When men were men and mice had balls

Unreal Tournament 3
Developer: Epic
Publisher: Midway
Platform: PC, PlayStation 3 (Xbox 360 in the future)
Price: $49.99 PC (Shop.Ars) $59.99 PS3 (Shop.Ars)
Rating: M (Mature)

Unreal Tournament 3 feels like an anachronism. Back in the glory days of competitive first-person shooters, it was all about speed, reflexes, and having the best route through each map. Get the health, get the best weapons, and be able to nail a moving target with a headshot. The days of Quake and Unreal were great for those of us with gaming ADD; I used to put on headphones and blast my favorite metal songs while blowing away my friends. I could get out all my aggression in about half an hour playing those games.

Then something happened. Things slowed down. Vehicles were put in. People were expected to play in large teams, hang back, use their "classes." The change that games like Battlefield 1942 brought to multiplayer games was an interesting shake-up, but then everything went that way. It's been a while since we've had a game that was all reflex, that was a mad dash to your favorite weapon, where you could download a new map or gametype every day if you wanted to.

I'm too awesome to notice that Darkwalker behind me.

Unreal Tournament 3, even with its odd name (it comes after Unreal Tournament and apparently combines Unreal Tournament 2003 and Unreal Tournament 2004 as Unreal Tournament 2, while leaving the single-player games aside), doesn't really have to do much as a game for it to get a hearty "buy" recommendation. With any of the Unreal games you're basically buying the latest iteration of the Unreal engine, with a few examples of what it can do. Sure, the out-of-the-box experience usually delivers some amazing thrills, but the real value is what amateur coders and enthusiasts will do with the game using the robust editors and tools. You'll be playing free, high-quality Unreal Tournament 3 content for at least the next two years; if the game came with an empty box and an engine license, it would still be a good deal.

But we can't review based on potential, even if amazing stuff is already online. We're going to look at what the game is like now, as well as how the game is blazing its own trail on consoles. On the PC, this is an incredibly strong, almost old-school shooter. On the console, it's breaking down barriers.

Everyone moves faster, everything is slower

Ironically enough, those of us who spent way too much time playing Unreal Tournament 2004 will have a harder time getting a feel for UT3 than newcomers. The flak cannon isn't nearly as powerful or as fast. Neither is the rocket launcher. The minigun feels more powerful. Watching how easily nimble opponents can double-tap the A or D button to jump out of the way of the slower rockets while pumping you full of enforcer rounds is a humbling experience. It's still a twitch fest, but spamming rockets isn't the easy road to success.

You'll have plenty of time to figure all this out in the campaign mode, a single-player experience that takes you through the new game modes and maps while explaining what's going on. While this is a good way for rookies and veterans alike to get proficient in the game, the dogged determination to explain why a war is being fought like a first-person shooter is campy at first, then just mildly annoying. The "Field Lattice Generator" that's so important? Yeah, it's the flag. FLaG. Get it? I don't know why the game can't just calm down and be a game, but it's at least amusing in its earnestness.

Still, it's worth it to go through at least a few of the missions in the campaign mode just to figure out what's up. To make the process a little bit more tolerable, you can play with others against the PC-controlled bots in a kind of single-player co-op mode. This can be a good way to learn some tactics while not at the mercy of another human team, and it's a very welcome feature, especially considering how the teammate AI falls apart in objective-based maps.

At first, Unreal Tournament 3 might seem just a cosmetic upgrade to an already old formula, but once you start to really dig in, you get a sense for how much things have changed and how different the playing field now is.

AMD’s Barcelona, Phenom suffer early setbacks

Posted on 08/08/2019 by

It has been over four months since AMD launched the quad-core Barcelona server processor that was to turn the company around, and a little over three weeks since the launch of Phenom, the desktop derivative of Barcelona. So how is the new quad-core part doing? Not so well, thanks to an odd error that has limited the clockspeeds of the new chips at precisely the moment when AMD needs all the GHz it can get to keep up with Intel.

Those of you who've been following the TLB erratum saga will know much of the following story, but if you haven't been keeping track, here's a recap to bring you up to speed.

Slow launch speeds point to design defect

Phenom came out of the gate last month with an underwhelming launch performance that left it looking like an also-ran against Intel's Kentsfield-based 2.4GHz Q6600. With Yorkfield-based quad-core parts already available, AMD badly needed to establish its quad-core part as superior to the Q6600 and failed to do so with Phenom. At 2.3GHz, Phenom still trails the Q6600, and while bumping the chip up to 2.4GHz does shrink Intel's lead, it does not eliminate it.

Shortly after launch day, it emerged that Phenom's slow launch speed was merely a symptom of a deeper problem, and one that also affects Barcelona. AMD announced the existence of an error in Barcelona's and Phenom's TLBs (translation lookaside buffers), and that revelation has tainted the current version of the core. Despite AMD's promise that the error is extremely rare, consumers have historically reacted poorly to hardware that they perceive as defective. In this case, the defect might never appear over the entire lifetime of a desktop chip like Phenom—but it's still enough to push away potential buyers.

Though the problem may be rare for Phenom customers, it is apparently more acute for Barcelona customers. Server chips generally see more intense cache usage, so Barcelona users are more likely to see TLB-related problems than Phenom users. This being the case, AMD has allegedly stopped shipping Barcelona to customers, although there's some dispute over whether or not there has really been a change in the company's shipping patterns.

Fixing the problem

AMD has created a solution to the flaw that Phenom board manufacturers are required to include, but the fix itself knocks anywhere from 5 percent to 20 percent of performance off the chip. That's not good news for a processor that's already lagging behind its closest Intel counterpart, and it will turn off potential Phenom customers even further.

AMD fixed the TLB erratum in the upcoming "B3" version of Phenom and Barcelona. Unfortunately, that core revision isn't expected until mid-to-late Q1, and it's not clear how willing AMD will be to scale the K10 architecture before the new core arrives. Currently, AMD's roadmap projects the arrival of a 3GHz Phenom in the second quarter of 2008, but that's a very slow ramp considering that Intel's quad-core Yorkfield is already available at 3GHz.

AMD did the right thing by being upfront and honest about K10's TLB erratum, and that may help the company's sales in the longer term by generating consumer goodwill. Regardless of whether or not this occurs, however, Sunnyvale's next few quarters aren't going to be pretty. Even once the processor's TLB issue is corrected, the new quad-core line's projected scaling isn't aggressive enough to close the gap between itself and Intel's Penryn offerings.

Some good news on the desktop front

While Barcelona's future is still looking murky, Phenom has a few bright spots amidst the current clouds. The immediate good news is that chips and boards are available in the retail channel. NewEgg currently carries both the Phenom X4 9500 (2.2GHz) and the Phenom X4 9600 (2.3GHz) at prices of $239.99 and $275.99 respectively. Current Phenom parts obviously aren't going to break any sales records, but AMD is at least shipping the part. The company is also confident that end-users will never experience the TLB bug—so much so, in fact, that it intends to offer end-users the chance to turn the required BIOS fix on and off at will. Future versions of AMD Overdrive will include such a toggle, thus allowing enthusiasts to choose for themselves which mode to run in.

AMD plans to announce a multiplier-unlocked "Black Edition" Phenom in the near future. The chip will run at the same 2.3GHz as the current 9600 Phenom X4, and will retail for the same $275.99. There's no word on what kind of ceiling these chips might have—launch processors tended to top out at 2.8GHz or so, but AMD has stated that their current K8 65nm production is running very well. That's possibly indicative of a general improvement across all AMD product lines, but it may also apply specifically to K8 production.

With Phenom unlikely to catch Core 2 Duo in terms of raw performance, AMD will have to find other areas in which to compete with Intel. Strong execution from ATI, hopefully culminating in the company's return to the high-end market, would certainly help, as would further refinements of the 790 chipset. The company has also given itself some room to cut prices on Phenom and could potentially squeeze further performance or lower power consumption out of K8. There's also the introduction of tri-core and native dual-core products yet to come, both of which could potentially improve AMD's outlook.

Bali climate meeting update: limits out, mitigation aid in (updated)

Posted on 08/08/2019 by

The Bali negotiations, meant to lay the groundwork for a successor to the Kyoto Protocol, are nearing their end, and the picture of what will emerge has become clearer. Despite an impassioned plea by the Secretary-General of the UN and strong backing from the nations of the European Union, the US and China have succeeded in preventing any agreements on hard targets for future carbon reductions. Progress has been made, however, in addressing some of the concerns of the developing world.

Although the Bali negotiations began last week, many high-level officials are arriving now, after the preliminary work has been completed. Ban Ki-moon, Secretary-General of the UN, spoke to the gathered delegates earlier today, calling climate change "the moral challenge of our generation." He presented inaction as a threat to future inhabitants of the planet: "Succeeding generations depend on us," he said. "We cannot rob our children of their future."

To a large extent, however, chances for any definitive action had already passed by the time he spoke. The US delegation entered these negotiations with the intention of blocking any hard limits on carbon emissions, and they carried the day over the objections of the European Union and most developing nations. This position is in keeping with the Bush administration's promotion of long-term aspirational goals, rather than strict limits.

When talking to reporters after his speech, Ban apparently accepted that setting limits at this stage "may be too ambitious," according to the Associated Press. "Practically speaking," he said, "this will have to be negotiated down the road."

With no movement on that front, attention has shifted towards a second major goal of the Bali meeting, aid to the developing world. The New York Times is reporting that more significant progress has been made on that front. Kyoto had set up a mechanism for funding adaptation work in the developing world, but the effort was chronically underfunded and developing nations have found it difficult to get projects approved. The new agreement streamlines the approval process and levies a tax on carbon trading markets to fund the program. Although this is still unlikely to cover anywhere close to the predicted costs the developing world will face due to climate change, the Times suggests that this agreement is being viewed by those nations as a sign that their concerns are at least being treated seriously.

Given the stated positions of major participants going into these talks, these results are fairly predictable. Much of the hard work involved in forging a binding agreement has been put off to the future. Perhaps the most significant consequence of the Bali negotiations is that any agreement on binding emissions limits is unlikely to be completed prior to the end of the Bush administration. Many of the presidential candidates appear far more willing to consider mandated limits on carbon emissions, raising the possibility that the negotiations that have been put off to the future will be less contentious when that future arrives.

UPDATE: The BBC is reporting that the European Union is threatening to retaliate for the US' intransigence at Bali by boycotting future climate talks hosted by the White House. These talks are intended to set the aspirational goals for emissions curbs favored by the US; the absence of major emitters would make them largely pointless. The report suggests that this threat may simply be an EU negotiating ploy, but many have interpreted the Washington talks as little more than an attempt to undermine Bali in the first place.

A first look at KDE 4.0 release candidate 2

Posted on 08/08/2019 by


The KDE development community has been working on KDE 4, a major KDE revision that brings in numerous new technologies, and the official release is scheduled for next month. My colleague Troy has provided in-depth overviews of previous betas, but today we're going to take a look at the second KDE 4 release candidate, which was made available for download earlier this week.

For my tests, I used the KDE 4 RC 2 Live CD image provided by the KDE community. The Live CD is based on OpenSUSE and provides easy access to a complete KDE 4 environment.

Although the release candidate has many rough edges and is significantly lacking in polish, the underlying technologies are all in place and deliver some very impressive functionality. KDE 4 offers some unique architectural advantages over its predecessor, including a completely redesigned desktop and panel system called Plasma, an improved version of the KWin window manager with advanced compositing features, a new modular multimedia framework called Phonon, and a sophisticated hardware API called Solid.
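Of those pieces, Phonon is the easiest to illustrate from a developer's perspective: playing a media file takes only a few lines against a Qt-style API, with the actual decoding handled by whatever backend is installed. The fragment below is a minimal sketch of that idea rather than code from the KDE project itself; the file path and application name are placeholders, and header spellings can vary slightly between releases.

#include <QtGui/QApplication>
#include <Phonon/MediaObject>

int main(int argc, char *argv[])
{
    QApplication app(argc, argv);
    // Phonon identifies the application to the sound system by name
    QCoreApplication::setApplicationName("phonon-sketch");

    // createPlayer() hands back a MediaObject already wired to an audio output;
    // the decoding itself is delegated to whichever Phonon backend is installed
    // (xine, GStreamer, and so on)
    Phonon::MediaObject *player =
        Phonon::createPlayer(Phonon::MusicCategory,
                             Phonon::MediaSource("/path/to/song.ogg"));
    player->play();

    return app.exec();
}

The point of the indirection is that the same application code keeps working whether the underlying sound stack is xine today or something else tomorrow.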

KDE 4 uses a completely new visual style and icon set called Oxygen. The Oxygen widget theme uses a lot of subtle gradients, light shadows, round edges, and background highlighting. Generally, the Oxygen style manages to be attractive without becoming a distraction, but there are still some places where additional work is needed. This is most evident in a handful of preference dialogs where elements are cut off or not rendered correctly. In most of the major desktop applications, Oxygen looks nice. I was also impressed with the flexibility of Oxygen. The appearance preferences dialog gives users extensive control over the colors of specific elements. Oxygen looks great with both dark and light color schemes.

In current versions of KDE, Konqueror is both a web browser and a file manager. In KDE 4, the focus for Konqueror has shifted towards browsing, while Dolphin has become the default file manager. When I first looked at Dolphin back in April, I really liked what I saw. The version of Dolphin included in the release candidate still serves up all the same great features and adds a few more nice ones, like a new column view that is similar to the Mac OS X Finder. Dolphin is powerful, intuitive, and a pleasure to use.

I tested several other applications as well, like Amarok 2 and KWord 2, both of which are impressive despite some rough edges. KWord 2's somewhat unusual user interface is intriguing, but clearly requires more polish to be truly useful. KWord 2 pushes most of the formatting features into a vertical dock that runs along the side of the window, which seems practical for users with widescreen monitors. The Amarok 2 user interface has a lot of nice little aesthetic flourishes and is easily the most polished of the KDE 4 applications.

Currently, the most significant weaknesses in KDE 4 are in the Plasma desktop layer. The Plasma components used for the desktop and the panel are undercooked. One of the primary features of Plasma is support for desktop-embedded widgets—called plasmoids—that are similar in nature to SuperKaramba widgets, Windows Sidebar gadgets, or Mac OS X Dashboard widgets. I tested several of the plasmoids that ship with the release candidate, including a clock, battery meter, calculator, and dictionary. On two occasions, adding a plasmoid to the desktop caused the screen to go black and the computer to become unresponsive, requiring me to forcibly kill and restart Xorg in order to continue with my tests. The plasmoids themselves are also somewhat buggy.
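For context, a plasmoid is ultimately just a small C++ plugin loaded by the desktop shell. The sketch below shows roughly what the KDE 4.0 Plasma::Applet API asks of a widget author; the class and plugin names are made up for illustration, and a real applet also needs a .desktop metadata file and a build recipe before Plasma will load it.

#include <Plasma/Applet>
#include <QPainter>
#include <QStyleOptionGraphicsItem>

class HelloApplet : public Plasma::Applet
{
    Q_OBJECT
public:
    HelloApplet(QObject *parent, const QVariantList &args)
        : Plasma::Applet(parent, args) {}

    // Plasma calls this whenever the applet needs to repaint itself
    void paintInterface(QPainter *painter,
                        const QStyleOptionGraphicsItem *option,
                        const QRect &contentsRect)
    {
        Q_UNUSED(option);
        painter->drawText(contentsRect, Qt::AlignCenter, "Hello, Plasma");
    }
};

// Registers the class with Plasma under the plugin name "helloapplet"
K_EXPORT_PLASMA_APPLET(helloapplet, HelloApplet)

#include "helloapplet.moc"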

The panel at the bottom of the screen, which is also provided by Plasma, includes a menu button, task list, and notification area. I was unable to find any way to configure the panel, which is significantly less functional than the panel in the current version of KDE.

The second KDE 4 release candidate illuminates the extent to which KDE 4 has matured since the earlier betas, but a massive infusion of debugging and polish is needed before the release next month. Heavy development on KDE 4 will obviously continue after the KDE 4.0 release, so whatever pieces are still missing are sure to be filled in eventually. Some critics point to the deficiencies of KDE 4 and argue that drastic reinvention of basic desktop components might not have been a good idea. After experiencing KDE 4 myself, I have to disagree.

Transitions are always hard, but when the dust settles, a clean break between versions and an opportunity to introduce some innovative new ideas should lead to a stronger user experience. After years of development, unnecessary cruft builds up and things tend to get disorganized. The KDE 4 transition, though it will definitely be rocky at first, gives developers the ability to cut away the cruft and reorganize code in a manner that makes the whole environment more future-proof and easier to maintain.

Evidence of the advantages might not be readily apparent to end users yet, but there are plenty of sweet improvements under the hood in KDE 4. Developer-oriented changes like the much-needed migration from Autotools to CMake, the shift from Qt 3 to Qt 4, or the move from DCOP to D-Bus, for instance, all offer very real advantages.
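To make the DCOP-to-D-Bus point concrete, here is a minimal, hypothetical sketch of a Qt 4 program calling a service over the session bus with QtDBus. The freedesktop notification service is used only because it is commonly available on Linux desktops, and the argument list follows that service's published Notify signature; none of this is KDE project code.

#include <QtCore/QCoreApplication>
#include <QtCore/QDebug>
#include <QtDBus/QtDBus>

int main(int argc, char *argv[])
{
    QCoreApplication app(argc, argv);

    if (!QDBusConnection::sessionBus().isConnected()) {
        qWarning() << "Cannot connect to the D-Bus session bus";
        return 1;
    }

    // Talk to the standard freedesktop notification service over the session bus
    QDBusInterface notifier("org.freedesktop.Notifications",
                            "/org/freedesktop/Notifications",
                            "org.freedesktop.Notifications");

    // Notify(app_name, replaces_id, app_icon, summary, body, actions, hints, timeout)
    QDBusReply<uint> reply = notifier.call("Notify",
                                           "dbus-sketch", uint(0), QString(),
                                           "Hello from D-Bus",
                                           "Qt 4 applications talk to the desktop over D-Bus",
                                           QStringList(), QVariantMap(), int(-1));

    qDebug() << (reply.isValid() ? "Notification sent" : "D-Bus call failed");
    return 0;
}

Under KDE 3, the equivalent conversation would have gone through DCOP, a KDE-only mechanism; moving to D-Bus lets KDE applications interoperate with GNOME and other desktop software over the same bus.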

My experience with KDE 4 revealed a set of hairy, nasty warts that need to be resolved, but it also shone some light on some impressive improvements. Although I'm skeptical that all of the problems with the release candidate can be fixed in only one month, a rough first release seems a small price to pay for the significant long-term advantages offered by the transition.

Microsoft expands XNA development platform with Live functionality

Posted on 08/08/2019 by

Running with the success of XNA, Microsoft's open-ended development platform for use with PC and Xbox 360 game development, the company has moved the suite to the next phase with the launch of the XNA Game Studio 2.0. The new version allows developers to leverage the proprietary technologies of Xbox Live and Games for Windows Live in their creations. The new software is now available for download from the XNA Creator's Club web site.

The headlining feature of the latest build is the addition of networking support to the XNA Framework API for both Xbox Live and Games for Windows Live. The cross-platform networking protocol enables "developers to create multiplayer games in which players on separate machines can compete with each other or play cooperatively."

Also included in the update are a variety of equally significant features, including a new high-level "Game" application model and the opening of the "Guide" class to developers. As Xbox 360 users will know, the Guide is the part of the underlying Xbox 360 operating system that allows manipulation of the device, including text entry, storage drive selection, and more. The updated API also includes new methods for detecting alternative input devices, including the Xbox 360 chatpad.

The XNA platform launched publicly in March of 2006 and was an unusual step forward for the company. The typically closed-platform nature of console development was opened slightly for independent and hobbyist developers who were interested in developing for the increasingly popular platform in a cost-effective and relatively simple way, and cross-platform development was made easier for companies looking to release Xbox 360 and PC software as cost-effectively as possible.

The platform itself leverages a variety of other Microsoft technologies, including Visual Studio and, by extension, C#. In fact, the only real downside (from an end-user perspective, anyway) is the $99 annual fee required to have access to the developer network: the one stipulation that prevents the service from being truly open to the masses.

Nevertheless, the platform has been enjoying success in both real-world and academic applications, and this latest advancement will only serve to further the proliferation of Microsoft's development platform.