Archives for December 2011

JP Morgan Fires Up Maxeler FPGA Super

Sooraj Shah writes that investment bank JP Morgan has deployed a Maxeler dataflow supercomputer to manage the bank’s fixed-income trading operations. Built on FPGA technology, the system will be used for the analysis and profiling of intra-day trading risk. With the new Maxeler technology, JP Morgan’s trading businesses can now compute orders of magnitude more […]

Brothers Discover Maximal Information Coefficient – A Divining Rod for Big Data?

Philip Ball of Nature reports that brothers David and Yakir Reshef have devised a statistical method that can spot many superimposed correlations between variables. Using this technique, their research team is able to measure exactly how tight each relationship is, on the basis of a quantity that the team calls the maximal information coefficient (MIC). […]
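For readers who want a feel for what MIC measures, here is a minimal Python sketch of the underlying idea: grid the two variables at several resolutions, compute the mutual information of each gridding, normalize it, and keep the maximum. The Reshefs' actual algorithm also optimizes where the grid lines fall and bounds the grid size by roughly n^0.6, so this equal-width-bin version is only an illustration; the function name mic_sketch and its parameters are invented here.

```python
# Rough, illustrative sketch of the MIC idea -- not the Reshefs' exact algorithm.
# Bin x and y on grids of several resolutions, compute the mutual information of
# the resulting 2-D histogram, normalize by log(min(rows, cols)), and keep the max.
import numpy as np

def mic_sketch(x, y, max_bins=10):
    """Approximate the maximal information coefficient of two 1-D samples."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    best = 0.0
    for bx in range(2, max_bins + 1):
        for by in range(2, max_bins + 1):
            # Joint distribution on a bx-by-by grid of equal-width bins
            counts, _, _ = np.histogram2d(x, y, bins=(bx, by))
            pxy = counts / counts.sum()
            px = pxy.sum(axis=1, keepdims=True)
            py = pxy.sum(axis=0, keepdims=True)
            nz = pxy > 0
            # Mutual information, normalized so the score lies in [0, 1]
            mi = np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz]))
            best = max(best, mi / np.log(min(bx, by)))
    return best

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    t = rng.uniform(-1, 1, 1000)
    print(mic_sketch(t, t**2 + 0.05 * rng.normal(size=t.size)))  # strong but nonlinear
    print(mic_sketch(t, rng.normal(size=t.size)))                # unrelated, near 0
```

The point of the example is the one the Reshefs emphasize: a parabola scores near zero under ordinary linear correlation but scores high here, because MIC rewards any relationship that concentrates the joint distribution on some grid.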

Contest Winner – CUDA on ARM Developer Kit to be Named CARMA

Clipped from: blogs.nvidia.com Thanks to its crowdsourcing contest, NVIDIA has named its ARM developer kit CARMA. Powered by a Tegra 3 quad-core ARM-based processor and an NVIDIA CUDA-enabled GPU, the CARMA DevKit is being developed to support energy-efficient HPC projects using ARM-based GPU computing. In fact, this technology will power the Barcelona […]

Secrets and Solutions From a Reformed Benchmarketer

By Dan Olds, Gabriel Consulting • At SC11 I ran into Henry Newman, CEO of HPC consulting firm Instrumental Inc. After exchanging the usual pleasantries and deeply offensive personal insults, we got to talking about some of the recently released benchmark results – and how irrelevant most of them are to the […]

Job of the Week: Bioinformatics Computing Consultant at NERSC

NERSC is seeking a Bioinformatics Computing Consultant in our Job of the Week. NERSC and the Joint Genome Institute (JGI) are searching for an individual who can help biologists exploit advanced computing platforms. JGI provides production sequencing and genomics for the Department of Energy and has recently partnered with NERSC (www.nersc.gov), the DOE Office of Science’s flagship […]

Korean Tachyon II Supercomputer Runs Largest Simulation of the Universe

A Korean supercomputer called Tachyon II has completed the largest-ever simulation of the universe. Ranked at #26 on the TOP500, Tachyon II took over 20 days to run the job on 26,232 processing cores. The purpose of the study – called Horizon Run 3 – was to simulate the birth and evolution of the […]

Video: Highlights of SC11 from the Swiss National Supercomputing Centre Booth

httpv://www.youtube.com/watch?v=dfKzjh3aZrM&f In this video, Rich Brueckner from insideHPC meets with the team from the Swiss National Supercomputing Centre to discuss the highlights of SC11.

HPC Advisory Council Switzerland Workshop, March 13-15, 2012

httpv://www.youtube.com/watch?v=f1yvR5Zmlcs#! In this video, Brian Sparks describes the upcoming HPC Advisory Council Switzerland Workshop on March 13-15, 2012 in Lugano. The Swiss National Supercomputing Centre is an important partner for the HPC Advisory Council, and CSCS was named the first center of excellence outside the USA. These centers provide essential tools and outbound activities for […]

Altair Fires Up HyperWorks On-Demand Cloud

This week Altair Engineering announced it has fired up new datacenter capacity for its HyperWorks On-Demand HPC Cloud. Powered by more than 10,000 cores, the HPC Cloud can support as many as 150 large-scale engineering solver jobs running simultaneously, employing Altair’s solvers RADIOSS, OptiStruct and AcuSolve along with other tools in the HyperWorks family of […]

Video: insideHPC Wraps Up the GPU Technology Conference Live from Beijing, China

httpv://www.youtube.com/watch?v=0E4dh2-dM90 In this video, Rich Brueckner from insideHPC wraps up the GPU Technology Conference live from Beijing, China.