
Partnership Between Cloudflare and Databricks Expected to Bring AI Inference to the Edge Through MLflow and Databricks Marketplace

Cloudflare, Inc. (NYSE: NET), the leading connectivity cloud company, today announced a continued collaboration with Databricks, the Data and AI company, to bring MLflow capabilities to developers building on Cloudflare’s serverless developer platform. Cloudflare is joining the open source MLflow project as an active contributor to bridge the gap between training models and easily deploying them to Cloudflare’s global network, where AI models can run close to end-users for a low-latency experience.

As more businesses look to leverage AI to augment their products and processes, many steps are required to make it work end to end: collecting data, storing it, using it to train models, and then deploying those models for inference.

Databricks and Cloudflare already collaborate to simplify the AI lifecycle by making data sharing simpler and more affordable through Delta Sharing with R2 storage. Cloudflare's R2 is a distributed object storage offering with zero egress fees, allowing data teams to share live data sets and AI models with Databricks easily and efficiently, without complex data transfers or duplicated data sets.
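As a rough sketch of how that sharing works in practice: a Delta Sharing recipient connects through a small JSON profile file, while the shared tables themselves can sit in R2 and be read without egress charges. The endpoint and token below are placeholders, not real Databricks or Cloudflare values.

```json
{
  "shareCredentialsVersion": 1,
  "endpoint": "https://sharing.example.com/delta-sharing/",
  "bearerToken": "<recipient-token>"
}
```

With the open-source delta-sharing Python client, a recipient can then load a shared table by name, e.g. `delta_sharing.load_as_pandas("profile.json#share.schema.table")` (share, schema, and table names here are illustrative).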

In this new phase of partnership and collaboration on MLflow, Cloudflare and Databricks are closing the loop on how AI models are quickly and easily deployed to the edge. MLflow is an open-source platform, created by Databricks, for managing the machine learning (ML) lifecycle. It has become a leading platform for end-to-end MLOps, enabling teams of all sizes to track, share, package, and deploy any model for batch or real-time inference. With this new partnership, developers will be able to train models on Databricks' data-centric AI platform, then deploy them to Cloudflare's developer platform and global network, where inference runs hyper-locally at the edge, completing the AI lifecycle.

Developers building on Cloudflare Workers AI will be able to deploy MLflow-compatible models easily into Cloudflare's global network. They can use MLflow to efficiently package, implement, deploy, and track a model directly on Cloudflare's serverless developer platform.

“Cloudflare’s new Workers AI platform is the first end-to-end serverless solution to deploy AI at the edge,” said Matthew Prince, CEO and co-founder, Cloudflare. “Together with Databricks we can offer a comprehensive and optimized platform to support a wide spectrum of AI workflows, models and applications.”

“We are excited that Cloudflare will be supporting MLflow, enabling customers to quickly and easily deploy the models they trained on Databricks directly to the edge with Cloudflare,” said Craig Wiley, Sr. Director of AI/ML Product at Databricks. “Cloudflare’s new edge deployment capabilities expand the value of MLflow to a broad set of new use cases.”

DSA Editorial

The region’s leading specialist IT news publication focused on Data Lifecycle, Storage Infrastructure and Data-Driven Transformation. DSA has nearly 17,000 e-news subscribers, over 6,500 unique visitors per day, over 20,000 social media followers and a reputation for deep domain knowledge.
