The Week in Vis


For this week in Vis, we talk about the use of simulations in Formula 1, shattering objects, and bringing down buildings.

Red Bull Technology uses HPC & FieldView in Formula 1

Red Bull Technology’s Formula 1 team just brought home a 1-2 finish at the Chinese Grand Prix in Shanghai in April, and another 1-2 at Silverstone in June. How did they do it? FieldView from Intelligent Light gave them insight into the CFD models used to simulate their cars, helping them edge out the competition.

Learning from high-fidelity computational fluid dynamics (CFD) simulations and rapidly turning the results into design decisions is the rule at Red Bull Technology. With several thousand cores of high-performance computing (HPC) capacity churning out simulation results 24 hours a day, converting the solution data into key metrics, launching comparative studies, and understanding important flow phenomena requires robust, efficient tools. Red Bull Technology relies on FieldView™ CFD post-processing software from Intelligent Light to turn simulation data into actionable information in this highly productive CFD workflow.
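To make the phrase “converting the solution data into key metrics” a bit more concrete, here is a tiny, generic NumPy sketch of reducing per-element surface pressures to a single drag coefficient. It is purely illustrative: this is not FieldView’s API or Red Bull’s workflow, and the function name and numbers are made up.

```python
# Toy example of turning CFD surface data into one key metric (a drag
# coefficient). Generic NumPy sketch for illustration only; not FieldView.

import numpy as np

def drag_coefficient(pressures, areas, normals, rho, v_inf, ref_area,
                     flow_dir=np.array([1.0, 0.0, 0.0])):
    """Integrate surface pressure forces and normalize by dynamic pressure.

    pressures : (N,) gauge pressure on each surface element [Pa]
    areas     : (N,) element areas [m^2]
    normals   : (N, 3) outward unit normals for each element
    """
    # Positive gauge pressure pushes along the inward normal of each element.
    forces = -pressures[:, None] * areas[:, None] * normals
    drag = forces @ flow_dir                   # force component along the flow
    q_inf = 0.5 * rho * v_inf ** 2             # freestream dynamic pressure
    return drag.sum() / (q_inf * ref_area)

if __name__ == "__main__":
    # Tiny synthetic check with made-up numbers.
    p = np.array([120.0, -40.0, 10.0, 10.0])                    # Pa (gauge)
    a = np.full(4, 0.01)                                        # m^2 each
    nrm = np.array([[-1, 0, 0], [1, 0, 0], [0, 1, 0], [0, -1, 0]], float)
    print(drag_coefficient(p, a, nrm, rho=1.2, v_inf=70.0, ref_area=1.5))
```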

John Carmack talks about the iPhone 3GS Hardware

John Carmack of id Software, recently acquired by Bethesda’s parent company ZeniMax Media, sits down with Tom’s Hardware to talk about the iPhone 3GS and what can be done with the new OpenGL ES 2.0 hardware, and he’s excited about the possibilities.

In fact, he proclaims, “now I am very excited about what I can do from a hardware and graphics standpoint with the 3GS. With vertex fragment shaders and OpenGL 2.0, I’m pretty convinced that I can actually run the MegaTexture id Tech 5 content creation pipeline on there. And I’m not sure what game I want to do that with yet, but the combination of seeing people download 700mb files of Myst on there, and the new capabilities, I could do some mind-blowingly cool stuff on there.”

id has already released a new Doom game for the iPhone and says it is working on a new iPhone Doom trilogy.

Gaikai demos remote Game Rendering

The previously mentioned Gaikai has just released a video demonstration showing how its service works, and how well. The blog post accompanying the video has some interesting details:

(1) No installing anything. (I’m running regular Windows Vista with the latest Firefox and Flash installed.)

(2) This is a low-spec server with a very custom configuration, fully virtualized. Why? To keep the costs to an absolute minimum. We had 7 Call of Duty games running on our E3 demo server recently.

(3) Data travel distance is around 800 miles (round trip) on this demo, as that’s where the server is. I get a 21 millisecond ping on that route. My final delay will be 10 milliseconds, as I just added a server in Irvine, California yesterday, but it’s not added to our grid yet. (So this demo is twice the delay I personally would get; the good news is I don’t notice it anyway.)

(4) This server is not hosted by a Tier 1 provider, just a regular data center in Fremont, California. Also, I’m not cheating and using fiber connections for our demos. This is a home cable connection in a home.

(7) Our bandwidth is mostly sub-1-megabit across all games. (Works over Wi-Fi, works on netbooks with no 3D card, etc.)

Check out the video after the break. It looks like they might really have something here. The demo shows Spore (PC version), Mario Kart 64 (in an emulator), and several others all running within Firefox.
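For a sense of how much of that quoted 21 ms ping is simply the speed of light, here is a quick back-of-the-envelope sketch in Python. It is our own estimate based on the 800-mile round trip mentioned in point (3), assuming signals move through fiber at roughly two-thirds of c.

```python
# Rough propagation-delay estimate for the Gaikai demo numbers above.
# Assumptions (ours, not Gaikai's): signals travel through fiber at
# roughly two-thirds the speed of light in a vacuum.

SPEED_OF_LIGHT_KM_S = 299_792   # km/s in a vacuum
FIBER_FACTOR = 2 / 3            # typical propagation speed in fiber
MILES_TO_KM = 1.609344

def propagation_delay_ms(round_trip_miles: float) -> float:
    """Minimum round-trip time spent just moving bits over the distance."""
    km = round_trip_miles * MILES_TO_KM
    return km / (SPEED_OF_LIGHT_KM_S * FIBER_FACTOR) * 1000

if __name__ == "__main__":
    rtt = propagation_delay_ms(800)   # the 800-mile round trip in the demo
    print(f"Propagation alone: {rtt:.1f} ms of the quoted 21 ms ping")
    # The remaining ~15 ms is routing, server time, and the home cable link.
```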

ILM built the Transformers to withstand IMAX scrutiny

ILM built the models in the new Transformers movie to withstand scrutiny in IMAX theaters. The process, with all of the gory details, is cataloged over at StudioDaily.

There’s a sequence in DreamWorks/Paramount’s Transformers: Revenge of the Fallen in which Devastator, a gigantic Decepticon robot, scales an Egyptian pyramid as easily as a gorilla climbing a tree and rips off the top. With 52,632 parts and nearly 12 million polygons, the robot is the biggest model Industrial Light & Magic has built in its 30 years of model-making. And they built the bot for IMAX shots.

The Molecule offers advice on working with large 3D Models & Simulations

The Molecule has a new blog post up talking about the problems they ran into doing the 3D modeling and simulation work for “The Detonators” on Discovery Channel. Each of the 20 simulations required multiple render layers (five to eight) plus 2D overlays, and they picked up a lot of interesting tricks along the way.

Additionally, it is important to note that the layers were not all rendered at the same resolution: occlusion and shadow passes were amongst the slowest, and rendering them at half- or quarter-resolution would allow us to decrease the render time by a factor of 4 or 16, with no noticeable difference in final image quality.
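The factor-of-4 and factor-of-16 savings fall straight out of the pixel counts: halving the resolution on both axes leaves a quarter of the pixels to shade, and quartering it leaves a sixteenth. A quick sketch of that arithmetic (our own illustration, not The Molecule’s pipeline):

```python
# Why half- and quarter-resolution passes cut render time by ~4x and ~16x:
# the cost of a pass scales (roughly) with the number of pixels shaded.

def relative_cost(scale: float) -> float:
    """Pixel count relative to full resolution when both axes are scaled."""
    return scale * scale

full = 1920 * 1080                      # example full-resolution frame
for scale in (1.0, 0.5, 0.25):
    pixels = int(full * relative_cost(scale))
    print(f"scale {scale}: {pixels:,} pixels, "
          f"relative cost {relative_cost(scale):.4g}")
```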

NASA & Japan release the world’s most complete Topographic Map

By combining 1.3 million individual stereo-pair images collected by the Japanese Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) instrument aboard NASA’s Terra satellite, NASA and Japan’s Ministry of Economy, Trade and Industry (METI) have constructed the most complete digital elevation map ever made.

“This is the most complete, consistent global digital elevation data yet made available to the world,” said Woody Turner, ASTER program scientist at NASA Headquarters in Washington. “This unique global set of data will serve users and researchers from a wide array of disciplines that need elevation and terrain information.”

Previously, the best maps covered only 80% of the Earth’s landmass. You can see visualizations of the data here, and download the elevation model here.
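The GDEM tiles are distributed as GeoTIFF elevation rasters, so once you have downloaded one it is straightforward to make your own shaded-relief rendering. A minimal sketch, assuming the rasterio and matplotlib packages are installed and using a hypothetical tile filename:

```python
# Minimal sketch: read one ASTER GDEM elevation tile (a GeoTIFF) and
# display a shaded-relief view. The filename below is a placeholder for
# whichever tile you downloaded.

import rasterio
import matplotlib.pyplot as plt
from matplotlib.colors import LightSource

TILE = "ASTGTM_N34W118_dem.tif"            # hypothetical example tile name

with rasterio.open(TILE) as src:
    elevation = src.read(1).astype(float)  # first band holds elevation (m)

ls = LightSource(azdeg=315, altdeg=45)     # light from the northwest
plt.imshow(ls.hillshade(elevation, vert_exag=0.1), cmap="gray")
plt.title("ASTER GDEM tile, shaded relief")
plt.axis("off")
plt.show()
```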