This article on CIO covered a survey in which a third of CIOs were tackling cloud integration and implementation issues. Chief among their headaches were greater-than-expected implementation and integration costs, and the capability challenge of moving data into the cloud. Despite these unexpected costs, 70% of respondents still reported that cloud infrastructure had already delivered cost savings and significant efficiencies.

Cloud right now is pretty cool, but if you can sort out integration, it would be awesome.

Perhaps the most difficult part of cloud integration is the coding effort required to move data from a business's on-premises servers to the cloud while navigating the data quality minefield. Data feeds have to be coded by hand, and finding staff with experience in this newly emerging field is a real challenge.

Wouldn’t it be great if a business could move data from local stores to the cloud without needing any coding skills beyond writing a SQL query? And what if data errors were fixed automatically during the transfer? While you’re at it, you may as well automate the process so thousands of tables can be selected and moved into NoSQL stores in one step…
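To make the idea concrete, here is a minimal sketch of the SQL-to-NoSQL step: run a query against a relational source and reshape each row into the dictionary-style document that most NoSQL stores accept. This is purely illustrative (using an in-memory SQLite table as a stand-in for an on-premises database), not Conductor's actual implementation; the `customers` table and `rows_to_documents` helper are hypothetical names.

```python
import sqlite3

def rows_to_documents(conn, query):
    """Run a SQL query and turn each row into a dict 'document',
    the shape most NoSQL document stores accept for insertion."""
    conn.row_factory = sqlite3.Row  # rows become name-addressable
    cur = conn.execute(query)
    return [dict(row) for row in cur.fetchall()]

# Illustrative local data standing in for an on-premises database.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (id INTEGER, name TEXT)")
conn.executemany("INSERT INTO customers VALUES (?, ?)",
                 [(1, "Acme"), (2, "Globex")])

docs = rows_to_documents(conn, "SELECT id, name FROM customers")
# Each dict in `docs` can now be handed to a cloud store's insert API.
```

The point of the sketch is that the only thing the user writes is the SQL query itself; everything after that (row-to-document mapping, batching, upload) is mechanical and therefore automatable.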

Our plan is to make moving data from the mess of databases, spreadsheets and text files into IaaS and PaaS data stores as easy as possible. We’ve built this into our existing workflow (here’s how it works now..), so you simply authenticate with your cloud provider and Conductor pushes your source data into the cloud with minimal effort.

If you have any questions, or want to see things in action before we launch the feature, feel free to get in touch; I’d be happy to help.