Oracle kicks off 20-city cloud tour

Oracle today kicked off its 20-city worldwide tour to promote its cloud offerings. The first Oracle Code event, in San Francisco, was keynoted by Thomas Kurian, president of Oracle product development. He walked attendees through the finer points of setting up systems and storage within the Oracle cloud. Kurian demonstrated specific capabilities … continue reading

Slack introduces Threads, Realm Mobile Platform reaches v1.0, and CA Technologies officially acquires Automic—SD Times news digest: Jan. 19, 2017

Slack today introduced a much-sought-after feature called Threads, which lets users reply to messages, organize discussions, and manage their conversations. According to Paul Rosania, a member of the core product team at Slack, the company set out to implement a feature that would group conversations in a channel to make it clear which message … continue reading

NuoDB 2.6 released with more elastic SQL scale-out capabilities

NuoDB wants to help developers elastically scale out their cloud applications with the latest release of its SQL database. NuoDB 2.6 is designed to lower costs, improve availability, and support active-active database deployments and distributed storage. “The demand for cloud applications has never been higher as organizations rapidly modernize their infrastructure to keep pace with … continue reading

Analyst View: Data entry or data sensing?

Replacement of data entry with data sensing can enable better, faster, and more relevant results from application software. But it’s important for software architects to see that this change means they are setting a course for their application away from decision support and toward control.

Find the data
Let’s start with a manufacturing example. People … continue reading

Guest View: The first release of Apache Arrow

Work on Apache Arrow has been progressing rapidly since its inception earlier this year, and Arrow is now the open-source standard for columnar in-memory execution, enabling fast vectorized data processing and interoperability across the Big Data ecosystem.

Background
Apache Parquet is now the de facto standard for columnar storage on disk, and building on that … continue reading

How to deal with data at scale

The world increasingly runs on data, and that data is only expanding. Like the Blob, it gets everywhere: storage systems, databases, document repositories. According to IDC, the world will hold 44 zettabytes of data by 2020, up from 4.4 zettabytes in 2013. That’s a lot of hard drives. It’s also a recipe for development and … continue reading

IBM’s Project Intu, Google Slides API, and Altova MobileTogether 3.0—SD Times news digest: Nov. 10, 2016

IBM is giving developers the ability to embed Watson’s cognitive functions in their end-user device solutions. The company announced the experimental release of Project Intu, a new platform that aims to help build the next generation of cognitive experiences. Developers can use Project Intu to develop for form factors such as wearables, avatars, robots and … continue reading

Redis adds Spark-based machine learning

Redis and Spark are coming a bit closer together. Redis Labs today announced a new project, Redis-ML, to bring Spark-based machine learning capabilities to the Redis database. The combination will provide a faster place to store a trained Spark machine learning model. Redis-ML is hosted on GitHub and will be demonstrated at the Big Data … continue reading

MongoDB can now store graph databases

Users of the MongoDB NoSQL data store can now keep graph databases in it. Today the company announced MongoDB 3.4, and with it came a host of new features, including the ability to host graph databases. Kelly Stirman, vice president of strategy and product marketing at MongoDB, said this update focuses on … continue reading

DeepMind AI model can learn from own memory, Zuckerberg seeks voice actor for home AI, and AWS/VMware deliver new vSphere cloud offering—SD Times news digest: Oct. 14, 2016

DeepMind, an artificial intelligence firm now under the Alphabet umbrella, has developed differentiable neural computers (DNCs), which can learn from examples like neural networks but can also store complex data like conventional computers. In designing DNCs, DeepMind wanted machines that can form and navigate complex data structures on their own. Inside … continue reading

Altova MissionKit enhances its Big Data, database and XBRL tools

Altova MissionKit’s latest release features changes to its Big Data, database, and XBRL tools, designed to increase an organization’s productivity. Altova’s last big release, in February, focused mostly on support for JSON and .NET APIs, introducing new tools for JSON and features to speed up JSON development. Altova MissionKit 2017’s latest updates … continue reading

Apache CouchDB 2.0, CA to acquire BlazeMeter, and Datadog’s APM extension—SD Times news digest: Sept. 21, 2016

The open-source database Apache CouchDB has hit version 2.0. CouchDB is designed to scale from Big Data apps to mobile devices. “CouchDB 2.0 finally fulfills the project’s original vision to bring reliable data sync to Big Data and Mobile in one seamless solution,” said Jan Lehnardt, vice president of Apache CouchDB. “The team has been … continue reading
