December 18, 2011

Fujitsu handles parallel processing, Big Data load fluctuations with new processing technology

Analyses time series data and puts the results to use

By CBR Staff Writer

Fujitsu and Fujitsu Laboratories have jointly developed complex event processing technology for cloud environments that uses distributed and parallel processing, enabling the system to adjust to fluctuations in data loads when processing massive amounts of heterogeneous time series data, now popularly known as "big data."

Complex event processing is a method for mining big data in real time to extract useful information. Database-based approaches, which require the data to first be stored on a medium such as a disk, are therefore unsuited to real-time processing.
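To make the contrast concrete, here is a minimal, hypothetical sketch of the complex event processing style: each event is evaluated the moment it arrives, with no prior storage step. The class and parameter names are illustrative and are not Fujitsu's actual API.

```python
from collections import deque

class SlidingAverageQuery:
    """A toy CEP query: flag any window whose mean reading exceeds a threshold."""

    def __init__(self, window_size, threshold):
        # Bounded in-memory window -- no disk storage, unlike a database approach.
        self.window = deque(maxlen=window_size)
        self.threshold = threshold

    def on_event(self, value):
        """Called once per incoming event; returns an alert or None immediately."""
        self.window.append(value)
        if len(self.window) == self.window.maxlen:
            mean = sum(self.window) / len(self.window)
            if mean > self.threshold:
                return ("ALERT", mean)
        return None

# Feed a small time series through the query as a stream of events.
query = SlidingAverageQuery(window_size=3, threshold=50)
alerts = [r for v in [10, 40, 60, 70, 80] if (r := query.on_event(v))]
```

Each call to `on_event` does a constant amount of work, which is what makes the per-event latency low enough for real-time use.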

Applying distributed and parallel processing technology to complex event processing enables finer-grained processing that can be dynamically redistributed during execution at high speed, so the system adjusts to load fluctuations immediately without pausing processing.

The technology analyses time series data and puts the results to use. It refines the granularity of processing so that these finer-grained units can be transferred between servers, and it is built on technology that selects the optimum candidate processing to transfer.

The new technology applies distributed and parallel processing to address load variations through unit-based management of complex event processing: it takes each query, parallelises that query's data, and refines the work into smaller parts.
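The data-parallelisation step described above can be sketched as partitioning a query's input stream by key, so that each partition becomes a smaller, independently schedulable unit of the same query. This is a generic illustration of the idea, not Fujitsu's implementation; the modulo on the integer key stands in for a hash-partitioning function.

```python
def partition_events(events, num_partitions):
    """Split a keyed event stream into num_partitions independent sub-streams.

    events: iterable of (key, value) pairs with integer keys.
    Events sharing a key always land in the same partition, so each
    partition can run the query logic on its own server.
    """
    partitions = [[] for _ in range(num_partitions)]
    for key, value in events:
        partitions[key % num_partitions].append((key, value))
    return partitions

# A stream of (sensor_id, reading) events split into two parallel units.
events = [(101, 3.2), (102, 4.1), (103, 2.7), (101, 3.9)]
partitions = partition_events(events, 2)
```

Because both events from sensor 101 fall into the same partition, any per-key state (such as a sliding window) stays local to one processing unit.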

In addition, the complex event processing distributes loads effectively by migrating the processing tasks for which the impact of migration is lowest, based on the speed of load fluctuations, the properties of each event or query, and the status of the processing load.
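One way to picture the "lowest-impact migration" selection is as a search over the processing units on an overloaded server: among the units whose removal would shed enough load, pick the one that is cheapest to move. The sketch below is hypothetical, with the amount of state to transfer standing in for migration cost; the field names are illustrative, not from Fujitsu's system.

```python
def pick_migration_candidate(units, load_to_shed):
    """Choose the processing unit whose migration has the lowest impact.

    units: list of dicts with keys 'name', 'state_bytes' (cost of moving
    the unit's state to another server) and 'load' (fraction of server
    load the unit accounts for).
    Returns the cheapest-to-move unit that still sheds enough load,
    or None if no single unit suffices.
    """
    eligible = [u for u in units if u["load"] >= load_to_shed]
    if not eligible:
        return None
    return min(eligible, key=lambda u: u["state_bytes"])

units = [
    {"name": "q1", "state_bytes": 4096, "load": 0.30},
    {"name": "q2", "state_bytes": 512,  "load": 0.25},
    {"name": "q3", "state_bytes": 128,  "load": 0.05},
]
chosen = pick_migration_candidate(units, load_to_shed=0.20)
```

Here q3 is cheapest to move but sheds too little load, so the selector prefers q2 over the much heavier q1. A production scheduler would also weigh the speed of load fluctuations and per-query properties, as the article notes.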

With the new technology, it becomes possible to build services that continuously analyse large-scale time series data from in-house facility systems in real time, using only the resources required. Because configurations can be altered flexibly, strict advance estimates of resource requirements are no longer needed, making the technology accessible even for small-scale operations.
