Inside HPC & AI News | High-Performance Computing & Artificial Intelligence
At the Convergence of HPC, AI at Scale, Quantum
SDSC Announces Comprehensive Data Sharing Resource

March 13, 2020 by staff

SDSC has announced the launch of HPC Share, a data sharing resource that will enable users of the Center’s high-performance computing resources to easily transfer, share, and discuss their data within their research teams and beyond.

“HPC users face a range of hurdles to share their data,” said SDSC Visualization Group Leader Amit Chourasia, also SeedMeLab’s Principal Investigator (PI). “Their collaborators may not have adequate context for the computed data, or may not be able to find, access, or fetch the data from the HPC system. Collaborators also often have a pressing need to repeatedly find, access, and review data products in a fast-paced environment. These hurdles create a bottleneck at best; at worst they can cripple the scientific discovery process and burden the principal HPC user with devoting considerable time and effort to ad-hoc, poorly maintained data sharing methods rather than focusing on the research.”

HPC Share addresses these hurdles with a ready-to-use infrastructure that accelerates the pace of research and information exchange. Its key capabilities include easy data transfer; access and sharing via a web browser on any device; the ability to annotate and discuss any file or folder; and visualization of tabular data. HPC Share is available to all users of SDSC’s Comet supercomputer, with potential expansion to SDSC’s Triton Shared Computing Cluster (TSCC) and its Expanse supercomputer, slated to enter production later this year.
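The workflow described above — transfer data off the HPC system, annotate it, discuss it, and expose it to collaborators — can be sketched as a small conceptual model. This is not SeedMeLab’s actual API; the class and method names below are illustrative assumptions about the kind of operations a user performs:

```python
from dataclasses import dataclass, field


@dataclass
class SharedItem:
    """A file or folder registered in the data-sharing space (hypothetical model)."""
    path: str
    annotations: list = field(default_factory=list)  # notes attached to the item
    comments: list = field(default_factory=list)     # discussion thread entries


class DataShare:
    """Conceptual sketch of an HPC Share workspace: transfer, annotate, discuss, share."""

    def __init__(self, owner):
        self.owner = owner
        self.items = {}           # path -> SharedItem
        self.collaborators = set()

    def upload(self, path):
        """Register a file or folder transferred from the HPC system."""
        self.items[path] = SharedItem(path)
        return self.items[path]

    def annotate(self, path, note):
        """Attach an annotation to any file or folder."""
        self.items[path].annotations.append(note)

    def comment(self, path, author, text):
        """Add a discussion comment to an item."""
        self.items[path].comments.append((author, text))

    def share_with(self, user):
        """Grant a collaborator browser access to the workspace."""
        self.collaborators.add(user)

    def visible_to(self, user):
        """Paths the given user can browse; empty for non-collaborators."""
        if user == self.owner or user in self.collaborators:
            return sorted(self.items)
        return []


# Example: a researcher shares a run's output with one collaborator.
share = DataShare(owner="pi_user")
share.upload("runs/comet/job42/energy.csv")
share.annotate("runs/comet/job42/energy.csv", "Converged after 10k steps")
share.share_with("collaborator")
```

The point of the sketch is the division of labor: the owner pushes data once, and collaborators then find, review, and discuss it in place instead of requesting ad-hoc copies.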

“HPC Share bridges a major gap in our current cyberinfrastructure by offering a turnkey system that eliminates barriers among collaborating researchers so that they can quickly access and review scientific results with context,” said SDSC Director Michael Norman. “Users will benefit from reduced complexity, ubiquitous accessibility, and, more importantly, rapid knowledge exchange.”

HPC Share is powered by SDSC’s open-source SeedMeLab software, developed with support from the National Science Foundation (NSF). Its built-in web services, coupled with an API extension, make it a versatile platform for creating branded data repositories for groups ranging from small research teams to large communities, or for integrating with existing research data flows, while also serving as an important stepping stone for researchers to realize FAIR data management in practice.
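“FAIR data management in practice” boils down to attaching machine-readable metadata that makes a dataset Findable, Accessible, Interoperable, and Reusable. As a minimal sketch (the field names below are illustrative, not SeedMeLab’s actual schema), such a record might look like:

```python
import json


def fair_record(identifier, title, access_url, fmt, license_id, creator):
    """Assemble a minimal FAIR-style metadata record for a shared dataset.

    Each field maps to one FAIR principle:
      Findable      -> persistent identifier and descriptive title
      Accessible    -> resolvable access URL
      Interoperable -> declared, standard data format
      Reusable      -> explicit license and provenance (creator)
    """
    return {
        "identifier": identifier,   # e.g. a DOI or other persistent ID
        "title": title,
        "accessURL": access_url,
        "format": fmt,              # e.g. a MIME type such as "text/csv"
        "license": license_id,      # e.g. "CC-BY-4.0"
        "creator": creator,
    }


# Hypothetical record for one simulation output; all values are placeholders.
record = fair_record(
    identifier="doi:10.5072/example",
    title="Comet simulation output, job 42",
    access_url="https://example.org/data/job42",
    fmt="text/csv",
    license_id="CC-BY-4.0",
    creator="Example Research Group",
)
print(json.dumps(record, indent=2))
```

A repository that stores and serves records like this alongside the data itself is what lets collaborators (and automated tools) discover and reuse results without contacting the original HPC user.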

March 19 Webinar

Researchers interested in learning more about HPC Share and SeedMeLab are invited to participate in a free webinar on March 19 that will cover use of the data management system on SDSC’s Comet supercomputer.

“We have set up a dedicated instance of SeedMeLab for our HPC users,” said Chourasia, who will lead the webinar. “We will provide hands-on training on moving, sharing, and presenting your data to and from Comet, organizing and storing it in one place while making it accessible to other researchers.” Details of the webinar can be found here.

In addition to Chourasia, the SeedMeLab project includes SDSC Director Michael Norman as co-PI and David Nadeau as a technical architect with SDSC. Master’s and undergraduate interns at UC San Diego’s Computer Science and Engineering Department, as well as regional high school students, have assisted in the project with prototype extensions and comprehensive quality assurance of the software.


Filed Under: Cloud HPC, HPC-AI Software, Industry Segments, News, Research / Education, Storage Tagged With: comet supercomputer, HPC Share, SDSC, SeedMeLab