Big Data is commonly defined by the four Vs: volume, variety, velocity, and veracity. Along with those, companies often encounter a fifth V: vexation. Managing even ordinary-scale data is a challenge, and with big data the challenges grow along with the data. Managing it successfully means carefully selecting and leveraging tools to replace the V for vexation with a V for victory.

Because of the scale of big data, most organizations turn to the cloud for storage and data management. Amazon Web Services (AWS) offers tools to help cope with each of those Vs.

Volume and Variety

The first characteristic of big data is simply that it’s big; more than that, it’s enormous. AWS helps businesses move that data into the cloud and then offers multiple ways of storing it that let you manage both its volume and its variety. Start with AWS Direct Connect, a dedicated network connection into AWS, or Amazon Snowball, a physical appliance for bulk transfers, to get data out of your data center and into the cloud.

Once in the cloud, Amazon Simple Storage Service (S3) offers scalable object storage. For businesses wanting a traditional data warehouse model, Amazon Redshift scales to petabytes. Amazon Aurora provides relational database storage that’s compatible with MySQL, and the Amazon Relational Database Service (RDS) also supports Oracle, Microsoft SQL Server, and PostgreSQL in the cloud. Amazon also supports several NoSQL databases, including Amazon DynamoDB and, from the Hadoop ecosystem, HBase.
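Object key design matters at S3 scale: a common convention is to partition keys by date in the prefix, so downstream tools (Amazon EMR jobs, for instance) can scan only the days they need. A minimal sketch; the source name, event id, and layout here are illustrative assumptions, not an AWS requirement:

```python
from datetime import datetime, timezone

def s3_key_for_event(source: str, event_id: str, ts: datetime) -> str:
    """Build a date-partitioned S3 object key so queries can prune by day.
    (Key layout is an illustrative convention, not an S3 API requirement.)"""
    return (f"{source}/year={ts.year:04d}/month={ts.month:02d}/"
            f"day={ts.day:02d}/{event_id}.json")

key = s3_key_for_event("clickstream", "evt-123",
                       datetime(2017, 5, 4, tzinfo=timezone.utc))
# key == "clickstream/year=2017/month=05/day=04/evt-123.json"
```

With keys shaped this way, a job that needs one day of data lists a single narrow prefix instead of the whole bucket.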

Velocity

Big data velocity challenges come from needing answers at every speed, from overnight batch jobs to real-time responses. Amazon offers specialized compute instances optimized for high-performance big data workloads. On the software side, Amazon EMR (Elastic MapReduce) supports Hadoop, Spark, and other popular big data processing frameworks. Machine learning is available to big data applications through Amazon Machine Learning. For real-time processing, the Amazon Kinesis services can ingest streaming data and process it as it arrives.
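Each record loaded into a Kinesis stream carries a data blob and a partition key that determines which shard receives it. A hedged sketch of shaping such a record, assuming a simple clickstream event with a `user_id` field (the event shape is an assumption for illustration):

```python
import hashlib
import json

def kinesis_record(event: dict) -> dict:
    """Shape one event for a Kinesis put: a Data blob plus a PartitionKey.
    Hashing the user id spreads traffic across shards while keeping each
    user's events in order on a single shard."""
    partition_key = hashlib.md5(event["user_id"].encode("utf-8")).hexdigest()
    return {
        "Data": json.dumps(event).encode("utf-8"),
        "PartitionKey": partition_key,
    }

record = kinesis_record({"user_id": "u-42", "action": "click"})
```

The resulting dict matches the `Data`/`PartitionKey` arguments a Kinesis producer call expects; the hashing choice is one common strategy for balancing shards, not the only one.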

Veracity

Veracity is perhaps the toughest of the four Vs to cope with. There’s too much data arriving too fast, in too many formats, for manual validation of its correctness. While you can build automated validations into your data load process with tools like AWS Data Pipeline, dealing with veracity in big data is largely an organizational and cultural challenge rather than a technical one. Business units need to develop a data culture that recognizes the value of the data they collect, even data that seems peripheral to the business function, and takes ownership of its quality.
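As an illustration of the kind of automated check a load pipeline can run before accepting a record, here is a minimal sketch; the field names and rules are assumptions for the example, not a specific AWS API:

```python
def validate_record(rec: dict,
                    required=("user_id", "timestamp", "amount")) -> list:
    """Return a list of data-quality problems with one record.
    An empty list means the record can continue through the load."""
    errors = [f"missing {f}" for f in required
              if rec.get(f) in (None, "")]
    amount = rec.get("amount")
    if amount not in (None, ""):
        try:
            if float(amount) < 0:
                errors.append("negative amount")
        except (TypeError, ValueError):
            errors.append("non-numeric amount")
    return errors
```

Records that fail checks like these can be routed to a quarantine location for the owning business unit to review, which is where the cultural change comes in: someone has to own the rejects.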


Having the right tools on hand only starts you down the path to victory; using them effectively requires knowledge and skill. dcVAST offers Managed Amazon Web Services that help businesses get the most out of what AWS provides. Contact us to learn more about Amazon Web Services and how AWS and dcVAST can help you get your big data challenges under control.