Black Duck SCA from Black Duck Software now supports the identification and analysis of open-source AI models with the release of its “AI Model Risk Insights” capability (2025.10.0). The addition gives organisations visibility into AI model usage, including model versions, datasets, and hidden or modified models, even when those models are not declared in build manifests. It detects models from public repositories such as the Hugging Face Hub via signature-based “CodePrint scanning”, bringing license and metadata tracking, including model cards and training-data insights, into the UI. The capability integrates into existing workflows through the platform’s BOM engine and supports compliance with regulatory regimes such as the EU AI Act and the U.S. Executive Order on AI by producing audit-ready reports on AI components.
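Black Duck has not published how CodePrint scanning works internally, but the general shape of signature-based model detection can be sketched: hash candidate weight files found in a codebase and look the digests up in an index of known open-source model artifacts, flagging matches whether or not a manifest declares them. The Python sketch below is a hypothetical illustration only; the `KNOWN_MODEL_SIGNATURES` index, the extension list, and `scan_repo` are assumptions for this example, not Black Duck's API or algorithm.

```python
import hashlib
from pathlib import Path

# Hypothetical signature index mapping SHA-256 digests of known
# open-source model artifacts to metadata. A real system would rely
# on a large, vendor-maintained database; this entry is a placeholder.
KNOWN_MODEL_SIGNATURES = {
    "0" * 64: {
        "model": "example-org/example-model",  # placeholder identifier
        "license": "apache-2.0",
    },
}

# File extensions commonly used for serialised model weights.
MODEL_EXTENSIONS = {".safetensors", ".bin", ".onnx", ".gguf", ".pt"}

def sha256_of(path: Path) -> str:
    """Stream-hash a file so large weight files are never fully in memory."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def scan_repo(root: str) -> list[dict]:
    """Flag model artifacts whose hashes match known open-source models,
    regardless of whether any build manifest declares them."""
    findings = []
    for path in Path(root).rglob("*"):
        if path.is_file() and path.suffix in MODEL_EXTENSIONS:
            digest = sha256_of(path)
            findings.append({
                "file": str(path),
                "sha256": digest,
                # None indicates an unknown or modified artifact that
                # matched no signature and may warrant manual review.
                "identified": KNOWN_MODEL_SIGNATURES.get(digest),
            })
    return findings

if __name__ == "__main__":
    for finding in scan_repo("."):
        print(finding)
```

A hash-based lookup of this kind catches verbatim copies of published weights; identifying modified or fine-tuned models, as the announcement claims, would require fuzzier matching than this sketch shows.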
“This innovation directly addresses the emerging security challenges of AI adoption,” said Jason Schmitt, CEO at Black Duck. “With the introduction of AI model scanning, Black Duck SCA is setting a new standard for software composition analysis… empowering companies to confidently integrate AI models securely while maintaining compliance and regulatory adherence.” The feature is offered as a new licensed add-on, reinforcing Black Duck’s strategy of helping organisations build secure, compliant software in an increasingly AI-driven development environment.