The bigger Big Data gets, the more developers seem to need complex event processing outside of Hadoop.

In its 2014 Big Data & Advanced Analytics Survey, market research firm Evans Data found that only 16% of developers said Hadoop batch processing is satisfactory for all of their use cases. Seventy-one percent of the more than 400 developers surveyed worldwide said they need real-time complex event processing in their applications more than half the time, and 27% said they use it all the time.


“Hadoop’s batch-processing model has worked well for several years,” said Janel Garvin, CEO of Evans Data. “But the demands and challenges of Big Data in our current world mean real-time event processing is becoming more necessary, which is why Apache Storm and commercial CEP products are being welcomed into the Big Data landscape.”
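The distinction Garvin draws is between scanning accumulated data in periodic batch jobs and reacting to each event as it arrives. As a rough illustration only (a hypothetical Python sketch, not Apache Storm code and not drawn from the survey), a stream processor might evaluate every incoming reading against a sliding window the moment it lands, rather than waiting for a nightly job:

```python
from collections import deque
import statistics

class StreamAnomalyDetector:
    """Toy complex-event-processing sketch: score each event on
    arrival against a sliding window of recent values, instead of
    waiting for a batch job to scan the whole data set."""

    def __init__(self, window_size=100, threshold=3.0):
        self.window = deque(maxlen=window_size)  # recent readings only
        self.threshold = threshold               # z-score cutoff

    def on_event(self, value):
        # A standard deviation needs a few samples to be meaningful.
        if len(self.window) >= 10:
            mean = statistics.mean(self.window)
            stdev = statistics.stdev(self.window)
            if stdev > 0 and abs(value - mean) / stdev > self.threshold:
                print(f"anomaly: {value} (window mean {mean:.1f})")
        self.window.append(value)

detector = StreamAnomalyDetector()
for reading in [10, 11, 9, 10, 12, 10, 11, 10, 9, 10, 50, 10]:
    detector.on_event(reading)  # flags the 50 as soon as it arrives
```

The point of the example is latency: the outlier is caught at arrival time, which is the property batch-oriented Hadoop jobs cannot offer on their own.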

The survey also identified developers’ biggest needs for maintaining security and data privacy when working with Big Data. High-speed querying topped the list, cited by 32% of developers, followed by real-time correlation and anomaly detection, the top priority for 24%.

Other trends identified in the survey include the need for expanded storage: nearly 90% of developers expect their companies’ data stores to grow by 50% to 75%. The survey also covered the use of sharding—partitioning a database across multiple servers—at companies of different sizes. Only 13% of Big Data developers at small companies currently use database shards, compared with 28% at medium-sized companies and 20% at large ones.
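For readers unfamiliar with the technique, sharding routes each record to one of several independent databases, typically by hashing a key. A minimal sketch, with hypothetical shard names and a plain hash-modulo scheme chosen purely for illustration:

```python
import hashlib

# Hypothetical shard identifiers; a real deployment would map these
# to connection strings for separate database servers.
SHARDS = ["db-shard-0", "db-shard-1", "db-shard-2", "db-shard-3"]

def shard_for(key: str) -> str:
    """Route a record to a shard by hashing its key.

    A stable hash is used here (Python's built-in hash() is salted
    per process), so the same key lands on the same shard across
    restarts.
    """
    digest = hashlib.md5(key.encode("utf-8")).hexdigest()
    return SHARDS[int(digest, 16) % len(SHARDS)]

print(shard_for("user:1042"))   # always the same shard for this key
print(shard_for("user:99371"))  # likely a different shard
```

Hash-modulo is the simplest routing scheme; consistent hashing is the usual refinement when shards must be added or removed without reshuffling most keys.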

Evans Data’s 2014 Big Data & Advanced Analytics Survey covers additional topics, including platforms and tools, data visualization and data modeling, database technologies, Big Data and the Internet of Things, and more.

The full report is available from Evans Data.