Amazon Web Services’ latest managed service, designed to process and stream terabytes of data in real time, is now publicly available.
Amazon Kinesis allows users to collect and store data such as financial transactions, social media feeds and location-tracking events from thousands of sources, and send it on to other big data services such as Amazon Simple Storage Service (Amazon S3) and Amazon Elastic MapReduce (Amazon EMR).
Amazon claims the service, which was unveiled in November, provides easy data ingestion, high data durability and the ability to scale from kilobytes to terabytes at reduced cost.
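On the ingestion side, producers push records into a named stream through the AWS SDK. The sketch below is a minimal illustration of that call using Python's boto3 client; the stream name, region and event fields are hypothetical and not part of Amazon's announcement.

```python
# Minimal producer sketch: push one event into a Kinesis stream.
# Assumes boto3 is installed and AWS credentials are configured;
# the stream name "clickstream-events" and the event fields are hypothetical.
import json
import boto3

kinesis = boto3.client("kinesis", region_name="us-east-1")

event = {"user_id": "42", "action": "page_view", "ts": "2013-12-17T10:00:00Z"}

# Records sharing a partition key are routed to the same shard,
# so per-key ordering is preserved.
response = kinesis.put_record(
    StreamName="clickstream-events",
    Data=json.dumps(event).encode("utf-8"),
    PartitionKey=event["user_id"],
)
print("Stored with sequence number:", response["SequenceNumber"])
```

The partition key determines which shard a record lands on, which is how a stream's throughput is spread across shards as it scales.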
Terry Hanold, VP of Cloud Commerce at AWS, said: "When we set out to build Amazon Kinesis, we wanted to eliminate the cost, effort, and expertise barriers that have prevented our customers from processing streaming data in real-time.
"We’ve gotten great feedback from our preview customers, and it’s inspiring to see the innovative ways customers are using Amazon Kinesis, across applications as diverse as gaming, mobile, advertising, manufacturing, healthcare, e-commerce, and financial services."
The Amazon Kinesis client library simplifies load balancing, coordination and fault tolerance, and developers can use Auto Scaling to create and scale the Amazon Elastic Compute Cloud (Amazon EC2) clusters that process the streams, according to the company.
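The client library handles the shard bookkeeping itself; as a rough sketch of what a processing application does underneath, the loop below polls a single shard with the low-level SDK calls. It again assumes Python's boto3 and the same hypothetical stream name, and is an illustration of the basic mechanics rather than how the client library is actually implemented.

```python
# Consumer sketch using the low-level API: read records from one shard.
# The Kinesis client library automates shard discovery, load balancing
# and checkpointing; this loop shows only the raw polling mechanics.
# Stream name and region are hypothetical.
import time
import boto3

kinesis = boto3.client("kinesis", region_name="us-east-1")
stream = "clickstream-events"

# Pick the first shard and start reading from the oldest available record.
shards = kinesis.describe_stream(StreamName=stream)["StreamDescription"]["Shards"]
iterator = kinesis.get_shard_iterator(
    StreamName=stream,
    ShardId=shards[0]["ShardId"],
    ShardIteratorType="TRIM_HORIZON",
)["ShardIterator"]

while iterator:
    batch = kinesis.get_records(ShardIterator=iterator, Limit=100)
    for record in batch["Records"]:
        print(record["PartitionKey"], record["Data"])
    iterator = batch.get("NextShardIterator")
    time.sleep(1)  # stay within the per-shard read rate
```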
The service also integrates with third-party products, giving developers the control and freedom to choose their preferred data-processing tools, including popular open-source options.
Donnie Flood, VP of Engineering at Bizo, said: "An Amazon Kinesis-based pipeline allows us to replace our existing, batch-oriented, data ingestion and aggregation mechanism, which forms the backbone of our data pipeline and reporting infrastructure.
"This reduces our operational burden and frees up our engineers’ time to focus on building targeted advertising solutions for our clients while Amazon Kinesis does the heavy lifting of scaling elastically in response to our growing business."