The race to develop cutting-edge artificial intelligence (AI) technologies has spurred fierce competition among tech giants. Microsoft’s Azure OpenAI service has been a dominant player in the AI industry, but a new partnership between Amazon Web Services (AWS) and Hugging Face aims to offer an alternative.
AWS and Hugging Face recently announced a collaboration to bring Hugging Face’s popular natural language processing (NLP) technology to AWS customers. This partnership offers an affordable and user-friendly alternative to Azure OpenAI, opening up new possibilities for businesses and developers looking to integrate NLP into their products and services.
Hugging Face and AWS are joining forces to offer LLMs and generative AI models through AWS's machine learning platform. Amazon SageMaker, together with the AWS Trainium and Inferentia chips, provides the capabilities Hugging Face's community needs to train models, fine-tune them to their requirements, and deploy them.
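As a rough illustration of the deployment side of this workflow, the SageMaker Hugging Face inference container is configured by pointing it at a model on the Hugging Face Hub. The sketch below only assembles that configuration locally; the model ID and task name are illustrative placeholders, and the actual `HuggingFaceModel(...).deploy()` call is omitted because it requires an AWS account and credentials.

```python
# Sketch: configuring a Hugging Face Hub model for a SageMaker endpoint.
# HF_MODEL_ID and HF_TASK are the environment variables the SageMaker
# Hugging Face inference container reads; the values here are placeholders.

def build_hub_config(model_id: str, task: str) -> dict:
    """Build the environment dict for the SageMaker Hugging Face container."""
    return {
        "HF_MODEL_ID": model_id,  # model to pull from the Hugging Face Hub
        "HF_TASK": task,          # pipeline task the endpoint should serve
    }

hub = build_hub_config(
    "distilbert-base-uncased-finetuned-sst-2-english",  # example Hub model
    "text-classification",
)

# With the sagemaker SDK installed and AWS credentials configured, this dict
# would be passed as sagemaker.huggingface.HuggingFaceModel(env=hub, ...),
# then deployed with .deploy(); that step is intentionally left out here.
```

In practice the heavy lifting (container selection, instance provisioning, autoscaling) is handled by SageMaker itself, which is the point of the partnership: the community supplies the models, AWS supplies the managed infrastructure.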
Heading into the recently concluded re:Invent event, expectations for AWS were high, owing to the increasing attention LLMs have received lately. Against this backdrop, and with industry dynamics shifting rapidly over the past few months, AWS entered into a partnership with Hugging Face.
Microsoft's collaboration with OpenAI has been receiving a lot of attention lately. Edge, Microsoft's browser, and Bing, its search engine, have taken advantage of OpenAI's flagship generative AI model, ChatGPT.
The Azure OpenAI Service is now available to developers, and Microsoft has leveraged this technology to integrate GPT-3 and ChatGPT with Edge, Bing, and its own Microsoft 365 platform.
Azure OpenAI is a managed service that makes it easy to consume OpenAI's pre-trained models. Developers can call models such as GPT-3 and Codex through simple API calls, without needing deep machine learning expertise or extra infrastructure work.
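To give a sense of how simple those API calls are, the sketch below assembles a completions request for the Azure OpenAI REST API. The resource name and deployment name are hypothetical placeholders, and a real call would also need an `api-key` header from the Azure portal; only the request is built here, and no network call is made.

```python
# Sketch: the shape of a completions request to the Azure OpenAI REST API.
# RESOURCE and DEPLOYMENT are illustrative placeholders; check the current
# Azure documentation for valid api-version values.
import json

RESOURCE = "my-resource"            # placeholder Azure OpenAI resource name
DEPLOYMENT = "my-gpt3-deployment"   # placeholder model deployment name
API_VERSION = "2022-12-01"          # example API version string

url = (
    f"https://{RESOURCE}.openai.azure.com/openai/deployments/"
    f"{DEPLOYMENT}/completions?api-version={API_VERSION}"
)

payload = {
    "prompt": "Summarize the AWS and Hugging Face partnership in one sentence.",
    "max_tokens": 64,
    "temperature": 0.2,
}
body = json.dumps(payload)

# A real request would then be sent with an HTTP client, e.g.:
# requests.post(url, headers={"api-key": KEY, "Content-Type": "application/json"}, data=body)
```

The deployment-based URL is the main difference from calling OpenAI directly: in Azure, each model is provisioned as a named deployment inside the customer's own resource, which is what enables the private, SLA-backed access described below.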
Azure users running hybrid workloads inside their virtual networks now have an SLA-backed, secure, and private way to access OpenAI services without traversing the public Internet. Specifically, Microsoft has extended its Azure Cognitive Services to include OpenAI workloads.
Access to Microsoft's Azure OpenAI Service is generally available, yet its stringent access policy makes it hard for developers to start consuming it. To access the service, interested customers must apply to Microsoft and submit documented use cases for evaluation.
Before any solution that leverages Azure OpenAI can be deployed to a production setting, its use case must first pass a review.
The way Azure OpenAI has been made available has generated confusion and disappointment, owing to Microsoft's stringent restrictions on how customers may use its AI services. Despite the service being generally available to developers, Microsoft carefully monitors the use of its AI technology.
Microsoft is committed to promoting Responsible AI, a standards framework designed to help customers incorporate six key principles – fairness, reliability & safety, privacy & security, inclusiveness, transparency, and accountability – into the development of their artificial intelligence systems.
The emphasis on Responsible AI, together with the high cost of AI infrastructure, has led Microsoft to be restrictive in opening up Azure OpenAI to the world. LLMs such as GPT-3 depend on robust and scalable GPU infrastructure, which is expensive.
AWS has spotted an opportunity to partner with Hugging Face to let developers use large language models (LLMs). The collaboration provides a more open alternative to Microsoft's Azure OpenAI and the selective approach Microsoft has taken with its ChatGPT service.
AWS may eventually take an approach similar to Microsoft's with Azure OpenAI, though for now the partnership does not bring ready-to-use AI APIs.
The partnership between AWS and Hugging Face to offer an alternative to Microsoft’s Azure OpenAI service represents a significant development in the AI industry. The collaboration between AWS, one of the world’s largest cloud computing providers, and Hugging Face, a leading natural language processing technology provider, promises to offer a powerful platform for developing and deploying AI applications.
The strengths of both companies, combined with their commitment to making AI technology more accessible and affordable, make this partnership a strong contender in the industry. By offering an alternative to Azure OpenAI, the collaboration between AWS and Hugging Face will give businesses and developers more options for integrating NLP technology into their products and services.