Artificial intelligence is becoming increasingly important in our lives, and Microsoft is leading the way in AI development.

But that’s not the only way Microsoft is leading on artificial intelligence; the company is also taking the reins when it comes to safe and responsible use.

Let’s take a closer look at Microsoft’s approach to AI and three ways the company is putting responsible development into practice.

Updates to Microsoft’s Responsible AI Standard

In a recent blog post, Microsoft outlined its ethical principles for AI development and use.

The company says it wants artificial intelligence to be a positive force in the world and recognizes the potential for misuse. As a result, Microsoft is committed to developing AI in a way that is transparent, accountable, and respectful of people’s rights and dignity.

The Responsible AI Standard lays out Microsoft’s approach to AI around six principles: fairness, reliability and safety, privacy and security, inclusiveness, transparency, and accountability. The company also says that it will continue to work with partners and customers to ensure that AI is used responsibly.

These principles provide a strong foundation for Microsoft’s AI work and set the stage for responsible innovation in this exciting field. The move also signals that Microsoft is taking a leadership position on responsible AI, helping ensure the technology is used in a way that benefits everyone.

3 Ways Microsoft is Leading the Way with Responsible AI Development

The Responsible AI Standard principles are based on the belief that AI should amplify human abilities and empower everyone to achieve more. In addition to laying out the company’s core values, the principles provide guidance on how Microsoft will design and deploy AI technologies.

For example, the company is committed to ensuring that AI systems are accessible and inclusive, protecting user privacy, and being transparent about how AI systems make decisions. By articulating its values and aspirations for AI, Microsoft is setting a high bar for responsible development and use of this transformative technology.

1. Retiring & Reworking Azure Face

Microsoft recently announced that it would retire and rework an artificial intelligence tool designed to recognize human facial features and identify emotions.

The tool, called Azure Face, is part of the company’s Cognitive Services suite. Microsoft said the decision to retire the tool was based on feedback from customers who were uncomfortable with its ability to identify individuals in photos.

Microsoft also said it is working on a new version of the tool to address these concerns. As such, Azure Face is no longer available to new users, and future partners will have to apply for access to the updated tool.
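For developers, here’s a minimal sketch of what a basic detection call looks like, assuming the older azure-cognitiveservices-vision-face Python SDK; the endpoint, key, and image URL are placeholders, and under the new access policy such calls only succeed for approved customers:

```python
# A minimal sketch using the older azure-cognitiveservices-vision-face SDK.
# The endpoint, key, and image URL below are placeholders, not real values.
from azure.cognitiveservices.vision.face import FaceClient
from msrest.authentication import CognitiveServicesCredentials

ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com/"  # placeholder
KEY = "<your-face-api-key>"  # placeholder

face_client = FaceClient(ENDPOINT, CognitiveServicesCredentials(KEY))

# Detect faces in a publicly reachable image. Attributes like emotion are
# among the retired capabilities, so this sketch requests only the face
# rectangles (no identification, no emotion inference).
detected_faces = face_client.face.detect_with_url(
    url="https://example.com/photo.jpg",  # placeholder image URL
    return_face_id=False,
)

for face in detected_faces:
    rect = face.face_rectangle
    print(f"Face found at ({rect.left}, {rect.top}), size {rect.width}x{rect.height}")
```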

2. Limiting the Custom Neural Voice Service

Azure’s artificial intelligence suite also includes Custom Neural Voice: a text-to-speech technology that learns from recordings of a real person’s voice and generates synthetic speech that sounds nearly identical to the original speaker.

While there are powerful education, accessibility, and even entertainment applications, there are also opportunities for misuse, including impersonation and fraud.

As such, Microsoft has determined that more limited use is best: businesses that would like to use the technology must apply for access and disclose their plans for Custom Neural Voice use.
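For context, here’s a minimal sketch of how a synthesis call works, assuming the azure-cognitiveservices-speech Python SDK; the key, region, voice name, and deployment ID are all placeholders, and a Custom Neural Voice only becomes usable after Microsoft approves an access application:

```python
# A minimal sketch of speech synthesis with the Azure Speech SDK for Python
# (azure-cognitiveservices-speech). All values below are placeholders.
import azure.cognitiveservices.speech as speechsdk

speech_config = speechsdk.SpeechConfig(
    subscription="<your-speech-key>",  # placeholder
    region="eastus",                   # placeholder
)

# For an approved Custom Neural Voice, point the config at your deployed
# model; for testing, a stock neural voice like en-US-JennyNeural works.
speech_config.speech_synthesis_voice_name = "MyCustomVoiceName"  # placeholder
speech_config.endpoint_id = "<your-custom-voice-deployment-id>"  # placeholder

# Synthesize a short phrase and play it through the default speaker.
synthesizer = speechsdk.SpeechSynthesizer(speech_config=speech_config)
result = synthesizer.speak_text_async("Hello from a neural voice.").get()

if result.reason == speechsdk.ResultReason.SynthesizingAudioCompleted:
    print("Speech synthesized successfully.")
```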

3. Tenant Allow/Block List Updates in Microsoft 365

Fraud and threat protection doesn’t end there. Microsoft is also making updates to the Tenant Allow/Block List.

Previously, if you blocked a contact, their future emails would no longer reach you, but nothing stopped anyone in your organization from emailing that blocked address. Now the block works in both directions, further reducing the risk of successful phishing scams.

If you use Microsoft 365, the Tenant Allow/Block List now has another layer of protection: you won’t be able to send emails to blocked contacts either.
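To make the behavior change concrete, here’s a toy Python sketch of the idea (not Microsoft’s implementation, and the addresses are hypothetical): the same block list that filters inbound mail now filters outbound mail too.

```python
# A toy illustration of the concept (not Microsoft's implementation):
# a tenant-wide block list that filters inbound mail and, with the
# update, outbound mail as well. Entries are hypothetical.
BLOCKED_SENDERS = {"phisher@scam.example", "spoofed.example"}

def domain(address: str) -> str:
    """Return the domain portion of an email address."""
    return address.rsplit("@", 1)[-1]

def is_blocked(address: str) -> bool:
    """An address is blocked if it, or its whole domain, is on the list."""
    return address in BLOCKED_SENDERS or domain(address) in BLOCKED_SENDERS

def allow_inbound(sender: str) -> bool:
    # Previous behavior: mail from blocked contacts never reaches you.
    return not is_blocked(sender)

def allow_outbound(recipient: str) -> bool:
    # New behavior: you also can't send mail to blocked contacts,
    # which closes off replies to phishing messages.
    return not is_blocked(recipient)

print(allow_inbound("phisher@scam.example"))      # False: inbound blocked
print(allow_outbound("phisher@scam.example"))     # False: outbound now blocked too
print(allow_outbound("partner@contoso.example"))  # True: not on the list
```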


Microsoft has long been a leader in AI technology and has now taken a stand on how that technology should be used with its Responsible AI Standard and other valuable updates.

Looking for a customized IT solution for your business? Don’t hesitate to schedule a free consultation with us at ClearTech Group.