microsoftazure
By Simon Bisson Microsoft has been polishing up its AI-powered Copilot in Azure for months now, and finally decided it’s ready for everyone to use. A public preview of Copilot in Azure will roll out relatively quickly, over a couple of weeks. If you’re not able to access Copilot in Azure immediately, rest assured you should see it in your Azure Portal soon, where it can help you manage, secure, and tune your Azure cloud infrastructure. I talked with Erin Chapple, Microsoft CVP, Azure Core Product and Design, about the new service and where it’s likely to go in the future. Like Microsoft’s othe...
Info World
By Simon Bisson Microsoft has been polishing up its AI-powered Copilot for Azure for months now, and finally decided it’s ready for everyone to use. A public preview of Copilot for Azure will roll out relatively quickly, over a couple of weeks. If you’re not able to access Copilot for Azure immediately, rest assured you should see it in your Azure Portal soon, where it can help you manage, secure, and tune your Azure cloud infrastructure. I talked with Erin Chapple, Microsoft CVP, Azure Core Product and Design, about the new service and where it’s likely to go in the future. Like Microsoft’s o...
Info World
By Simon Bisson Both extremely promising and extremely risky, generative AI has distinct failure modes that we need to defend against to protect our users and our code. We’ve all seen the news, where chatbots are encouraged to be insulting or racist, or large language models (LLMs) are exploited for malicious purposes, and where outputs are at best fanciful and at worst dangerous. None of this is particularly surprising. It’s possible to craft complex prompts that force undesired outputs, pushing the input window past the guidelines and guardrails we’re using. At the same time, we can see outp...
Info World
By Simon Bisson How do we ensure that the code we’re installing is, at the very least, the code that a vendor shipped? The generally accepted solution is code signing, adding a digital signature to binaries that can be used to ensure authorship. At the same time, the signature includes a hash that can be used to show that the code you’ve received hasn’t been altered after it’s been signed. Code signing is increasingly important as part of ensuring software bills of materials and reducing the risks associated with malware hijacking legitimate binaries. Signing is necessary if you’re planning on...
Info World
By Simon Bisson Once you get past the chatbot hype, it’s clear that generative AI is a useful tool, providing a way of navigating applications and services using natural language. By tying our large language models (LLMs) to specific data sources, we can avoid the risks that come with using nothing but training data. While it is possible to fine-tune an LLM on specific data, that can be expensive and time-consuming, and it can also lock you into a specific time frame. If you want accurate, timely responses, you need to use retrieval-augmented generation (RAG) to work with your data. RAG: the h...
Info World
By Paul Krill GitHub Actions, an automated CI/CD platform for GitHub, has been enhanced for enterprise customers, with capabilities including stronger security and GPU-enhanced runners for machine learning. GitHub announced updates to its hosted runner fleet for Actions on April 2. To strengthen security, GitHub Actions now offers Azure private networking for GitHub-hosted runners. The feature combines compute-in-the-cloud with secure access and control over network security, eliminating the overhead of maintaining infrastructure. Hosted runners for every major operating system are intended to...
Info World
By Paul Krill Microsoft is adding safety and security tools to Azure AI Studio, the company’s cloud-based toolkit for building generative AI applications. The new tools include protection against prompt injection attacks, detection of hallucinations in model output, system messages to steer models toward safe output, model safety evaluations, and risk and safety monitoring. Microsoft announced the new features on March 28. Safety evaluations are now available in preview in Azure AI Studio. The other features are coming soon, Microsoft said. Azure AI Studio, also in preview, can be accessed fro...
Info World
By Simon Bisson With KubeCon Europe taking place this week, Microsoft has delivered a flurry of Azure Kubernetes announcements. In addition to a new framework for running machine learning workloads, new workload scheduling capabilities, new deployment safeguards, and security and scalability improvements, Microsoft has placed a strong emphasis on developer productivity, working to improve the developer experience and helping reduce the risks of error. Prior to the event I sat down with Brendan Burns, one of the creators of Kubernetes, and now CVP, Azure Open Source and Cloud-Native at Microsof...
Info World
By Simon Bisson Falco, the open-source, cloud-native, runtime security tool, recently graduated from the Cloud Native Computing Foundation’s incubation program. That means it’s considered stable and ready for use in production environments, including Azure. It joins many of the key components of a cloud-native platform including Helm, Envoy, etcd, KEDA, and Cloud Events. I recently had a conversation with Loris Degioanni, the CTO and founder of cloud-native security company Sysdig and the creator of Falco, about the philosophy behind the project and how it’s being used across Kubernetes applic...
Info World
By Paul Krill Low-code development platform provider OutSystems has released AI Agent Builder, a no-code tool for building custom generative AI agents using large language models (LLMs) from Azure OpenAI or Amazon Bedrock. Part of the OutSystems Developer Cloud Platform and announced March 12, AI Agent Builder is intended to make it easy to incorporate generative AI-powered applications into a digital transformation strategy and govern the use of AI for standardization and security, the company said. Key features of AI Agent Builder include custom AI agent development, a library of “quickstart...
Info World
閲覧を続けるには、ノアドット株式会社が「プライバシーポリシー」に定める「アクセスデータ」を取得することを含む「nor.利用規約」に同意する必要があります。
「これは何?」という方はこちら