
Black Duck Assist: AI code security assistance in your IDE

By Patrick Carey, Executive Director of Product Marketing for Black Duck.
Johannesburg, 12 Aug 2025
AI increases the need for effective security testing.

Can I trust AI?

It seems that is the question we are all grappling with daily. For most of us, it is rooted in concerns about data privacy, intellectual property, hallucinations and bias. But as a developer using AI coding assistants like GitHub Copilot and Claude Code, I have another, more specific question: can I trust the code AI generates?

And by trust, I mean: am I confident that the code I get from these tools is free from bugs, and most importantly, security bugs? If we were asking this question of a human developer (e.g. “Can we trust the code that Steve writes?”), our answer might be: “Probably, but we should still test it to verify.” Steve might be the best developer on the team, but few dev teams would be comfortable simply taking Steve’s word that his code is perfect.

AI-generated code: The same, yet different

The same “trust but verify” rule applies to AI coding assistants. While the quality of AI-generated code has improved dramatically, anyone using these tools will tell you that the code they generate is often far from perfect. And with the introduction of prompt-based (AKA “vibe”) semiautonomous coding agents, these bugs are more likely to go undetected by human developers who will be less familiar with the code because they didn’t write it themselves. Bottom line: AI doesn’t decrease the need for effective security testing. It increases it.
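
To make the concern concrete, here is a minimal illustrative sketch (in Python, and not taken from any particular assistant's output) of the kind of defect that routinely slips into generated code: a database query built by string interpolation, which leaves the door open to SQL injection (CWE-89).

  import sqlite3

  def find_user(conn: sqlite3.Connection, username: str):
      # Typical of code an assistant might produce from a terse prompt:
      # the username is interpolated straight into the SQL string.
      # Input such as "alice' OR '1'='1" changes the query's meaning,
      # a classic SQL injection (CWE-89).
      query = f"SELECT id, email FROM users WHERE name = '{username}'"
      return conn.execute(query).fetchall()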

What’s more, the speed at which AI generates code means early detection of issues isn’t just nice to have; it’s essential to realising the productivity benefits of AI while managing the security risks. “Shift left” application security alone isn’t good enough. Application security checks now need to happen in real time, the instant code is generated.

Black Duck Assist: Your AI code security companion

This is why Black Duck is excited to announce the integration of Black Duck Assist, the company's AI application security assistant, into its Code Sight IDE Plug-in. For developers, Black Duck Assist is like having their own application security expert working with them, helping to ensure the code that they, and their AI coding assistants, write can be trusted to be free from security defects.

Black Duck Assist works alongside Black Duck's powerful static analysis engines in Code Sight and the Black Duck Polaris Platform. Developers can scan on-demand, during file saves or automatically as code is being written – or generated – to flag potential security issues within seconds.

When defects are found, Black Duck Assist uses Black Duck's expert AI model to give developers everything they need to fix them quickly, including:

  • Easy-to-understand issue explanations so developers don’t lose time trying to grok CWE descriptions.
  • Code analysis in context so they can see what they did wrong and how to avoid making the same mistake twice.
  • AI-generated code fixes so they can fix issues quickly and reliably (an illustrative sketch follows this list).
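
As a simple sketch of what such a fix typically looks like (illustrative only, not actual Black Duck Assist output), the injection example above is remediated by switching from string interpolation to a parameterised query:

  import sqlite3

  def find_user(conn: sqlite3.Connection, username: str):
      # The parameterised query passes the username as data, not as SQL
      # text, so crafted input can no longer alter the statement.
      query = "SELECT id, email FROM users WHERE name = ?"
      return conn.execute(query, (username,)).fetchall()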

Now developers can code, test and fix at the pace of AI, keeping momentum and remediating issues before they move on to other tasks. With Black Duck Assist, application security truly works for developers.

Vibe coding, meet vibe security

You’ve no doubt heard the term “vibe coding”. Essentially, this is a model where the developer doesn’t just use AI to generate a block of code, but uses natural-language prompts to direct agentic AI to build complex code modules, or even a full application. Recent demos by GitHub and others show that vibe coding isn’t just a theory. It’s a near-term reality.

But if you can give natural language instructions to your AI coding assistant, why not your AI code security assistant? While you may not yet be ready for vibe coding, much less “vibe security”, with Black Duck Assist you can use natural language queries in both Code Sight and Polaris to get answers to questions about your security testing. We’re in the early stages with this capability, but soon you’ll be able to use it to do everything from configuring and running scans to generating reports and trend analysis for security test results.

Code Sight with Black Duck Assist is available with Polaris today, for use in the most widely used IDEs including Visual Studio Code, IntelliJ and Eclipse. Black Duck has also added support for the Cursor and Windsurf AI code editors, for teams fully embracing AI-driven software development.

For more information, please contact Altron Arrow.
