AI Services and Models Security Shortcomings


Orca Report: AI Services and Models Show Security Shortcomings

Recent research by Orca Security reveals significant security vulnerabilities in AI services and models deployed in the cloud. The “2024 State of AI Security Report” underscores the urgent need for improved security practices as AI technologies advance rapidly.

AI usage is exploding. Gartner predicts that the AI software market will grow 19.1% annually, reaching 8 billion by 2027. In many ways, AI is now at a stage reminiscent of where cloud computing was over a decade ago.

Orca’s analysis of cloud assets across major platforms (AWS, Azure, Google Cloud, Oracle Cloud, and Alibaba Cloud) highlights troubling risks associated with AI tools and models. Despite the surge in AI adoption, many organizations are neglecting fundamental security measures, potentially exposing themselves to significant threats.

The report indicates that while 56% of organizations use their own AI models for various purposes, a substantial portion of these deployments contain at least one known vulnerability. Orca’s findings suggest that although most vulnerabilities are currently classified as low to medium risk, they still pose a serious threat. Notably, 62% of organizations have deployed AI packages with vulnerabilities, which carry an average CVSS score of 6.9. Only 0.2% of these vulnerabilities have known public exploits, compared to the industry average of 2.5%.

Insecure Configurations and Controls

Orca’s research reveals concerning security practices among widely used AI services. For instance, Azure OpenAI, a popular choice for building custom applications, was found to be improperly configured in 27% of cases. This lapse could allow attackers to access or manipulate data transmitted between cloud resources and AI services. The report also criticizes default settings in Amazon SageMaker, a prominent machine learning service.
It highlights that 45% of SageMaker buckets use non-randomized default names, and 98% of organizations have not disabled default root access for SageMaker notebook instances. These defaults create openings that attackers could exploit to gain unauthorized access and act on the underlying assets.

Additionally, the report points to a lack of self-managed encryption keys and encryption protection. For instance, 98% of organizations using Google Vertex AI have not enabled encryption at rest for their self-managed keys, potentially exposing sensitive data to unauthorized access or alteration.

Exposed Access Keys and Platform Risks

Security issues extend to popular AI platforms such as OpenAI and Hugging Face. Orca’s report found that 20% of organizations using OpenAI and 35% using Hugging Face have exposed access keys, heightening the risk of unauthorized access. This follows recent research by Wiz, presented at Black Hat USA 2024, which demonstrated vulnerabilities in Hugging Face through which sensitive data was compromised.

Addressing the Security Challenge

Orca co-founder and CEO Gil Geron emphasizes the need for clear roles and responsibilities in managing AI security. He stresses that security practitioners must recognize and address these risks by setting policies and boundaries. According to Geron, while the challenges are not new, the rapid development of AI tools makes it crucial to address security from both engineering and practitioner perspectives.

Geron also highlights the importance of reviewing and adjusting default settings to enhance security, advocating rigorous permission management and network hygiene. As AI technology continues to evolve, organizations must remain vigilant and proactive in safeguarding their systems and data.

In conclusion, the Orca report serves as a critical reminder of the security risks associated with AI services and models.
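As a practical illustration of the advice to review default settings, the notebook-instance default described above can be audited programmatically. The sketch below assumes instance records shaped like the output of boto3’s SageMaker `describe_notebook_instance` call (a real API, which reports a `RootAccess` field of `Enabled` or `Disabled`); the function name and the sample fleet are illustrative, and fetching the records from AWS is left to the caller so the audit logic stays testable offline.

```python
# Sketch: flag SageMaker notebook instances that still allow root access,
# one of the insecure defaults the Orca report calls out. Each record
# mirrors the shape of boto3's sagemaker describe_notebook_instance
# response; "find_root_access_instances" is an illustrative name.

def find_root_access_instances(instances):
    """Return names of notebook instances whose RootAccess is 'Enabled'."""
    return [
        inst["NotebookInstanceName"]
        for inst in instances
        # Root access is Enabled by default, so a missing field is treated
        # as the insecure default.
        if inst.get("RootAccess", "Enabled") == "Enabled"
    ]

# Example records (hypothetical names) shaped like the API response:
fleet = [
    {"NotebookInstanceName": "research-nb", "RootAccess": "Enabled"},
    {"NotebookInstanceName": "prod-nb", "RootAccess": "Disabled"},
    {"NotebookInstanceName": "legacy-nb"},  # field absent: default applies
]

print(find_root_access_instances(fleet))  # → ['research-nb', 'legacy-nb']
```

In a real audit, the records would come from iterating `list_notebook_instances` and calling `describe_notebook_instance` for each result.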
Organizations must take concerted action to secure their AI deployments and protect against potential vulnerabilities.

Balance Innovation and Security in AI

Tectonic notes that Salesforce was not included in the sampling.

Content updated September 2024.

Einstein Code Generation and Amazon SageMaker


Salesforce and the Evolution of AI-Driven CRM Solutions

Salesforce, Inc., headquartered in San Francisco, California, is a leading American cloud-based software company specializing in customer relationship management (CRM) software and applications. Its offerings include sales, customer service, marketing automation, e-commerce, analytics, and application development. Salesforce is at the forefront of integrating artificial intelligence (AI) into its services, enhancing its flagship SaaS CRM platform with predictive and generative AI capabilities and advanced automation features.

Salesforce Einstein: Pioneering AI in Business Applications

Salesforce Einstein is a suite of AI technologies embedded within Salesforce’s Customer Success Platform, designed to enhance productivity and client engagement. With over 60 features available across different pricing tiers, Einstein’s capabilities span machine learning (ML), natural language processing (NLP), computer vision, and automatic speech recognition. These tools empower businesses to deliver personalized and predictive customer experiences across functions such as sales and customer service. Key components include out-of-the-box AI features like sales email generation in Sales Cloud and service replies in Service Cloud, along with tools like Copilot Builder, Prompt Builder, and Model Builder within Einstein 1 Studio for custom AI development.

The Salesforce Einstein AI Platform Team: Enhancing AI Capabilities

The Salesforce Einstein AI Platform team is responsible for the ongoing development and enhancement of Einstein’s AI applications. The team focuses on advancing large language models (LLMs) to support a wide range of business applications, aiming to provide cutting-edge NLP capabilities.
By partnering with leading technology providers and leveraging open-source communities and cloud services like AWS, the team ensures Salesforce customers have access to the latest AI technologies.

Optimizing LLM Performance with Amazon SageMaker

In early 2023, the Einstein team sought a solution to host CodeGen, Salesforce’s in-house open-source LLM for code understanding and generation. CodeGen translates natural language into programming languages such as Python and is particularly tuned for Apex, the programming language integral to Salesforce’s CRM functionality. The team required a hosting solution that could handle a high volume of inference requests and many concurrent sessions while meeting strict throughput and latency requirements for EinsteinGPT for Developers, a tool that aids in code generation and review.

After evaluating various hosting solutions, the team selected Amazon SageMaker for its robust GPU access, scalability, flexibility, and performance optimization features. SageMaker’s specialized deep learning containers (DLCs), including the Large Model Inference (LMI) containers, provided a comprehensive solution for efficient LLM hosting and deployment. Key features included advanced batching strategies, efficient request routing, and access to high-end GPUs, which significantly enhanced the model’s performance.

Key Achievements and Learnings

The integration of SageMaker dramatically improved the performance of the CodeGen model, boosting throughput by over 6,500% and significantly reducing latency. SageMaker’s tools and resources enabled the team to optimize their models, streamline deployment, and manage resource use effectively, setting a benchmark for future projects.

Conclusion and Future Directions

Salesforce’s experience with SageMaker highlights the critical importance of leveraging advanced tools and strategies in AI model optimization.
The successful collaboration underscores the need for continuous innovation and adaptation in AI technologies, ensuring that Salesforce remains at the cutting edge of CRM solutions. For those interested in deploying their own LLMs on SageMaker, Salesforce’s experience serves as a valuable case study, demonstrating the platform’s capabilities in enhancing AI performance and scalability. To begin hosting your own LLMs on SageMaker, consider exploring AWS’s detailed guides and resources.
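For orientation, SageMaker’s LMI containers are typically driven by a `serving.properties` file. The fragment below is an illustrative sketch only: it points at the public Salesforce/codegen-2B-mono checkpoint on Hugging Face rather than the Apex-tuned model the Einstein team actually hosts, and the engine, parallelism, and batching values are assumptions, not Salesforce’s production settings.

```properties
# Illustrative serving.properties for a SageMaker LMI container.
# All values are assumptions, not Salesforce's production configuration.
engine=MPI
# Public CodeGen checkpoint; the Einstein team's Apex-tuned model is not public.
option.model_id=Salesforce/codegen-2B-mono
# Shard the model across 2 GPUs via tensor parallelism.
option.tensor_parallel_degree=2
# Rolling (continuous) batching, one of the batching strategies LMI provides.
option.rolling_batch=lmi-dist
option.max_rolling_batch_size=32
```

Packaging this file with the model artifacts and deploying behind a SageMaker endpoint is what enables the batching and request-routing features described above.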
