
Cloudflare enables developers to build AI applications on its network


The company has introduced Workers AI to provide the end-to-end infrastructure needed to scale and deploy AI models efficiently and affordably for the next era of AI applications.

Cloudflare has announced that developers can now build full-stack AI applications on Cloudflare’s network. The company says its developer platform will provide an end-to-end experience for developers building AI applications, enabling fast and affordable inference without the need to manage infrastructure. The platform aims to give developers the velocity to ship production-ready applications quickly, with security, compliance, and speed built in.

“Cloudflare has all the infrastructure developers need to build scalable AI-powered applications, and can now provide AI inference as close to the user as possible. We’re investing to make it easy for every developer to have access to powerful, affordable tools to build the future. Workers AI will empower developers to build production-ready AI experiences efficiently and affordably, and in days, instead of what typically takes entire teams weeks or even months,” said Matthew Prince, CEO and co-founder, Cloudflare.

“As enterprises look to maximize their operational velocity, more and more of them are turning to artificial intelligence. But it’s critical to deliver a quality developer experience around AI, with abstractions to simplify the interfaces and controls to monitor costs. This is precisely what Cloudflare has optimized its Workers platform for,” said Stephen O’Grady, Principal Analyst, RedMonk.

Through significant partnerships, Cloudflare will now provide access to GPUs running on Cloudflare’s global network to ensure AI inference can happen close to users for a low-latency end-user experience. Combined with Cloudflare’s Data Localization Suite, which helps control where data is inspected, Workers AI will also help customers anticipate compliance and regulatory requirements that are likely to arise as governments create policies around AI use. Cloudflare’s privacy-first approach to application development can help companies keep their promises to their customers by ensuring data used for inference is not used to train LLMs. Cloudflare currently supports a model catalog to help developers get started quickly, with use cases including large language model (LLM) text generation, speech-to-text, image classification, sentiment analysis, and more.
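To give a sense of what this looks like in practice, the sketch below shows a minimal Worker invoking a model from the Workers AI catalog in TypeScript. It assumes an AI binding named `AI` configured in the project's wrangler.toml and uses the `@cf/meta/llama-2-7b-chat-int8` model identifier from Cloudflare's public catalog; binding names and model identifiers may differ by account and catalog version.

```typescript
// Minimal Worker that runs text generation on Workers AI.
// Assumes an AI binding named "AI" is declared in wrangler.toml, e.g.:
//   [ai]
//   binding = "AI"
// The model identifier below comes from Cloudflare's public catalog
// and may change as the catalog evolves.

export interface Env {
  AI: { run(model: string, inputs: Record<string, unknown>): Promise<unknown> };
}

export default {
  async fetch(request: Request, env: Env): Promise<Response> {
    const result = await env.AI.run('@cf/meta/llama-2-7b-chat-int8', {
      prompt: 'Summarise why edge inference reduces latency, in one sentence.',
    });
    // Inference runs on GPUs in Cloudflare's network close to the user;
    // the Worker simply returns the model output as JSON.
    return Response.json(result);
  },
};
```

Because the model is already hosted on Cloudflare's network, there is no GPU infrastructure for the developer to provision or manage.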

Cloudflare has also introduced AI Gateway to make AI applications more reliable, observable, and scalable. According to the latest forecasts from IDC, AI spending is expected to balloon to $154 billion this year and increase to more than $300 billion in 2026. However, developers and C-suite leaders have had no way to understand how money is being spent across AI infrastructure, or how many queries are being made and where they originate.
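AI Gateway sits between an application and its model provider, so existing requests can be pointed at a gateway URL to gain logging, caching, rate limiting, and cost visibility. The following is an illustrative sketch only: the account ID and gateway name are hypothetical placeholders, and the endpoint format and OpenAI-compatible request shape are assumptions based on Cloudflare's AI Gateway documentation.

```typescript
// Route an existing OpenAI-style request through Cloudflare AI Gateway
// so that usage, latency, and spend become observable in one place.
// ACCOUNT_ID and GATEWAY_NAME are placeholders for illustration only.
const ACCOUNT_ID = 'your-account-id';
const GATEWAY_NAME = 'your-gateway';

const gatewayUrl =
  `https://gateway.ai.cloudflare.com/v1/${ACCOUNT_ID}/${GATEWAY_NAME}/openai/chat/completions`;

async function askViaGateway(apiKey: string, question: string): Promise<unknown> {
  const res = await fetch(gatewayUrl, {
    method: 'POST',
    headers: {
      Authorization: `Bearer ${apiKey}`,
      'Content-Type': 'application/json',
    },
    body: JSON.stringify({
      model: 'gpt-3.5-turbo',
      messages: [{ role: 'user', content: question }],
    }),
  });
  return res.json();
}
```

The application code otherwise stays the same; only the base URL changes, which is what lets the gateway surface per-query counts, origins, and spend.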
