Big Workflow: More than Just Intelligent Workload Management for Big Data


Big data applications represent a fast-growing category of high-value applications increasingly employed by business and technical computing users. They have, however, exposed an inconvenient dichotomy in how data center resources are utilized. Conventional enterprise and web-based applications run efficiently in virtualized server environments, where resource management and scheduling are generally confined to a single server. By contrast, data-intensive analytics and technical simulations demand large aggregated resources, requiring intelligent scheduling and resource management that spans a computer cluster, a cloud, or an entire data center. Although such scheduling and resource management tools exist in isolation, no general-purpose framework lets them interoperate easily and automatically within existing IT infrastructure.

A new approach, known as “Big Workflow,” is being created by Adaptive Computing to address the needs of these applications. It is designed to unify public clouds, private clouds, MapReduce-type clusters, and technical computing clusters. Specifically, Big Workflow will:
• Schedule, optimize, and enforce policies across the data center
• Enable data-aware workflow coordination across storage and compute silos
• Integrate with external workflow automation tools
Such a solution will provide a much-needed toolset for managing big data applications, shortening timelines, simplifying operations, maximizing resource utilization, and preserving existing investments. The sketch below illustrates the kind of data-aware placement decision such a framework automates.
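
To make the data-aware coordination point concrete, the following is a minimal sketch in Python of a placement decision that prefers the compute silo already holding most of a job's input data, subject to a simple utilization policy. Everything here (Silo, Job, place, MAX_UTILIZATION, the silo names) is a hypothetical illustration, not Adaptive Computing's API.

# Hypothetical sketch of data-aware, policy-constrained job placement.
# A real scheduler would also weigh queue depth, staging cost, and
# per-tenant policy; this shows only the locality-plus-policy core.
from dataclasses import dataclass, field
from typing import List, Optional, Set

@dataclass
class Silo:
    name: str                          # e.g. "hpc-cluster", "hadoop", "public-cloud"
    capacity_cores: int
    used_cores: int = 0
    datasets: Set[str] = field(default_factory=set)  # data already staged here

@dataclass
class Job:
    name: str
    cores: int
    inputs: Set[str]                   # datasets the job reads

MAX_UTILIZATION = 0.9                  # policy: keep 10% headroom in every silo

def place(job: Job, silos: List[Silo]) -> Optional[Silo]:
    """Pick the silo holding the largest share of the job's input data,
    among those that still satisfy the utilization policy."""
    candidates = [
        s for s in silos
        if (s.used_cores + job.cores) / s.capacity_cores <= MAX_UTILIZATION
    ]
    if not candidates:
        return None                    # policy forbids placement anywhere
    best = max(candidates, key=lambda s: len(job.inputs & s.datasets))
    best.used_cores += job.cores       # account for the reservation
    return best

silos = [
    Silo("hpc-cluster", capacity_cores=512, datasets={"sim-mesh"}),
    Silo("hadoop", capacity_cores=256, datasets={"clickstream", "logs"}),
    Silo("public-cloud", capacity_cores=1024),
]
job = Job("sessionize", cores=64, inputs={"clickstream"})
chosen = place(job, silos)
print(chosen.name if chosen else "deferred")   # -> hadoop (data locality wins)

In this toy run the job lands on the MapReduce-style silo because its input data is already staged there, even though the public cloud has far more free capacity; that preference for moving compute to data, enforced uniformly across silos, is what distinguishes data-aware workflow coordination from per-silo scheduling.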
