Microsoft is hoping to address what it is calling the “data divide” with the Open Data Campaign. The new initiative aims to help organizations across the industry realize the benefits of data. According to Microsoft, while there has been tremendous growth in data and AI, these technologies are concentrated in just a few companies. For … continue reading
The latest version of Tasktop Hub is now available. Tasktop Hub 20.2 is designed to make toolchain integration easier by enabling large-scale organizations to accelerate the flow of work and business value across their software portfolio. Key highlights include more control over operational processes using Conditional Field Flow, conflict resolution at the field level, enhanced support … continue reading
Microsoft has been collecting 13 million work items and bugs since 2001, and has used that data to create a machine learning model to fight software bugs. According to the company, the model distinguishes between security and non-security bugs 99% of the time and identifies high-priority bugs 97% of the time. “At Microsoft, 47,000 developers … continue reading
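The idea of classifying bug reports by their titles can be sketched with a toy Naive Bayes text classifier. This is purely an illustration of the general technique, not Microsoft's actual model; the example titles, labels, and word-frequency approach are all assumptions for the sake of the sketch.

```python
import math
from collections import Counter

def train(examples):
    """Fit a toy multinomial Naive Bayes model on (title, label) pairs."""
    word_counts = {}        # label -> Counter of words seen under that label
    label_counts = Counter()
    vocab = set()
    for title, label in examples:
        label_counts[label] += 1
        counts = word_counts.setdefault(label, Counter())
        for word in title.lower().split():
            counts[word] += 1
            vocab.add(word)
    return word_counts, label_counts, vocab

def classify(title, model):
    """Return the label with the highest log-probability for the title."""
    word_counts, label_counts, vocab = model
    total = sum(label_counts.values())
    best_label, best_score = None, float("-inf")
    for label in label_counts:
        # log prior + log likelihoods with add-one smoothing
        score = math.log(label_counts[label] / total)
        denom = sum(word_counts[label].values()) + len(vocab)
        for word in title.lower().split():
            score += math.log((word_counts[label][word] + 1) / denom)
        if score > best_score:
            best_label, best_score = label, score
    return best_label

# Hypothetical training data standing in for a real bug database
bugs = [
    ("buffer overflow in parser", "security"),
    ("sql injection in search endpoint", "security"),
    ("xss in comment form", "security"),
    ("button misaligned on settings page", "non-security"),
    ("typo in tooltip text", "non-security"),
    ("dashboard loads slowly", "non-security"),
]
model = train(bugs)
print(classify("possible sql injection in login form", model))
```

At production scale one would of course use a proper ML library and far richer features, but the core step is the same: turn bug titles into word statistics and score each label.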
It is common knowledge that the volume of data in existence is increasing exponentially, and many organizations are starting to struggle with the best way to cope with it. This trend will only accelerate with the rapid development of IoT, which according to IDC will generate 79.4 zettabytes (ZB) of data in 2025 from 41.6 … continue reading
The Open Mainframe Project is looking to fill a technology skills gap by providing COBOL resources to the public sector. According to the project, more than 10 million people in the United States have filed for unemployment due to the COVID-19 crisis, and there is an emergent need for COBOL programmers. “This pandemic underscores the … continue reading
The past year witnessed some of the biggest data breaches of all time, and the rapid proliferation of APIs has created new challenges in approaching the security landscape as a developer. “The fallout from not integrating security early in the development lifecycle has never been more apparent,” the 2019 State of Software Security report stated. … continue reading
Until recently, data science was a mostly academic pursuit and the subject of papers rather than practice. Over time, data science became an applied science with data scientists being paired with data engineers to develop production systems. We are now entering a new phase where much of the work being performed by data scientists (hyperparameter … continue reading
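One data-scientist task often mentioned as a target for automation is hyperparameter search. A minimal sketch of the idea, using an exhaustive grid search over a toy scoring function (the parameter names and the scoring surface here are illustrative assumptions, not any particular tool's API):

```python
from itertools import product

def validation_score(learning_rate, depth):
    """Toy stand-in for a real train-and-validate run.

    Peaks at learning_rate=0.1, depth=4 (an arbitrary choice for
    illustration); a real pipeline would train a model and return
    its validation metric here.
    """
    return -((learning_rate - 0.1) ** 2) - (depth - 4) ** 2

# Candidate values for each hyperparameter
grid = {
    "learning_rate": [0.01, 0.1, 1.0],
    "depth": [2, 4, 8],
}

# Try every combination and keep the best-scoring one
best = max(
    (dict(zip(grid, values)) for values in product(*grid.values())),
    key=lambda params: validation_score(**params),
)
print(best)  # → {'learning_rate': 0.1, 'depth': 4}
```

Automated tooling replaces the brute-force loop with smarter strategies (random or Bayesian search), but the contract is the same: propose parameter settings, score them, keep the best.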
As data becomes more important than ever to business success, modern organizations are constantly looking for tools and resources that can help them harness and make sense of that data. CData is making it easier for users to connect to data sources and tools with the release of hundreds of new Python Connectors, which are … continue reading
The White House is issuing a call to action for AI experts to develop new text and data mining techniques to analyze the newly released COVID-19 Open Research Dataset (CORD-19). The dataset, the most extensive machine-readable Coronavirus literature collection available, was created with input from researchers and leaders from the Allen Institute for AI, … continue reading
Quick Base has announced a new way for business professionals to work with IT and test low-code applications. The new Sandbox capability enables cross-functional teams to quickly create and optimize business-critical applications without risking disruption. Sandbox provides a place to easily collaborate with IT when making changes to new and existing workflows, while giving IT … continue reading
CData announced a $20 million Series A investment round from Updata Partners that will be used to accelerate the rollout of new data connectivity solutions. The company offers real-time drivers and data connectivity solutions for hundreds of SaaS, NoSQL, and Big Data sources that enable modern and legacy applications to connect with cloud data. Updata … continue reading
The National Oceanic and Atmospheric Administration (NOAA) announced that it is ramping up its computing power with two new Cray supercomputers in Virginia and Arizona, each with 12 petaflops of capacity, bringing NOAA’s total power up to 40 petaflops. These computers will unlock new possibilities for better forecast model guidance through higher-resolution and more comprehensive … continue reading