Internet Archives - gettectonic.com - Page 2
AI Leader Salesforce

AI Leader Salesforce

Salesforce Is a Wild Mustang in the AI Race

In the bustling world of artificial intelligence, Salesforce Inc. has emerged as an unsurpassed, true leader. "Salesforce?" one might wonder. The company known for its customer relationship management software? How can it be an AI leader when each department or division (or horse) is focused only on its own survival?

AI Leader Salesforce

Herds of horses have structure, with unique and important roles for each member. While the herd's survival depends greatly on each member's independence, every horse must remain steadfast in the roles and responsibilities it carries for the entire herd. The lead stallion must be the protector. The lead mare must organize the mothers and foals into obedient members of the herd. But they must all collaborate.

To stay strong and competitive, Salesforce is making bold strides in AI as well. Recently, the company became the first major tech firm to introduce a new class of generative AI tools known as "agents," which have long been discussed by others but never fully realized. Unlike its competitors, Salesforce is upfront about how these innovative tools might impact employment. This audacious approach could be the key to propelling the company ahead in the AI race, particularly as newer players like OpenAI and Anthropic make their moves.

Marc Benioff, Salesforce's dynamic CEO, is driving this change. Known for his unconventional strategies that helped propel Salesforce to the forefront of the software-as-a-service (SaaS) revolution, Benioff has secured a client base that includes 90% of Fortune 500 companies, such as Walt Disney Co. and Ford Motor Co. Salesforce profits from subscriptions to applications like Sales Cloud and Service Cloud, which help businesses manage their sales and customer service processes. At the recent Dreamforce conference, Salesforce unveiled Agentforce, a new service that enables customers to deploy autonomous AI-powered agents. If Benioff himself is the alpha herd leader, Agentforce may well be the lead mare.

Salesforce distinguishes itself by replacing traditional chatbots with these new agents. While chatbots, powered by technologies from companies like OpenAI, Google, and Anthropic, typically handle customer inquiries, agents can perform actions such as filing complaints, booking appointments, or updating shipping addresses. The notion of AI "taking action" might seem risky, given that generative models can sometimes produce erroneous results. Imagine an AI mishandling a booking. However, Salesforce is confident that this won't be an issue. "Hallucinations go down to zero because [Agentforce] is only allowed to generate content from the sources you've trained it on," says Bill Patterson, corporate strategy director at Salesforce. This approach is touted as more reliable than models that scrape the broader internet, which can include inaccurate information.

Salesforce's willingness to confront a typically sensitive issue, the potential job displacement caused by AI, is also noteworthy. Unlike other AI companies that avoid discussing the impact of cost-cutting on employment, Salesforce openly addresses it. For instance, education publisher John Wiley & Sons Inc. reported that using Agentforce reduced the time spent answering customer inquiries by nearly 50% over three months. This efficiency meant Wiley did not need to hire additional staff for the back-to-school season.
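Concretely, the difference between a chatbot and one of these agents is that the agent executes an action against business systems, within guardrails. The following is a minimal, hypothetical sketch of that pattern; the intent names, guardrail list, and customer record are invented for illustration and are not Salesforce's Agentforce API.

```python
# Minimal, hypothetical sketch of the chatbot-vs-agent distinction: an agent
# maps a customer intent to a pre-approved action, or escalates to a human.
# Intents, actions, and the record layout are invented for illustration.

ALLOWED_ACTIONS = {"update_shipping_address", "book_appointment", "file_complaint"}

customer_record = {"id": "C-1001", "shipping_address": "12 Old Road"}

def update_shipping_address(record: dict, new_address: str) -> str:
    record["shipping_address"] = new_address
    return f"Shipping address updated to: {new_address}"

def handle_request(intent: str, **kwargs) -> str:
    # Guardrail: the agent may only perform actions on the approved list.
    if intent not in ALLOWED_ACTIONS:
        return "I can't do that; transferring you to a human representative."
    if intent == "update_shipping_address":
        return update_shipping_address(customer_record, kwargs["new_address"])
    return f"Action '{intent}' queued for processing."

print(handle_request("update_shipping_address", new_address="34 New Lane"))
print(handle_request("cancel_subscription"))  # not on the approved list -> escalate
```

The key design choice is the explicit allow-list: anything outside it is routed to a person rather than improvised, which is the spirit of the "only from the sources you've trained it on" claim above.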
In the herd, the leader must acknowledge that some of his own offspring will have to join other herds; there is a genetic survival-of-the-fittest factor. I suspect Benioff will re-train and re-purpose as many of the Salesforce family as he can, rather than see them leave the herd. Benioff highlighted this in his keynote, asking, "What if you could surge your service organization and your sales organization without hiring more people?" That's the promise of Agentforce.

And what if? Imagine the herd leader having to be always the alpha, always on guard, always in protective mode. When does he sleep, eat, rest, and recuperate? Definitely not by bringing in another herd leader. The two inevitably come to blows, each exerting dominance until one is run off by the other to survive on his own. The herd leader needs to clone himself, creating additional herd, or corporate, assets to help him do his job better. Enter the power behind Salesforce's long history with artificial intelligence.

The effectiveness of Salesforce's tools in delivering a return on investment remains to be seen, especially as many businesses struggle to evaluate the success of generative AI. Nonetheless, Salesforce poses a significant challenge to newer firms like OpenAI and Anthropic, which have privately acknowledged their use of Salesforce's CRM software. For many chief innovation officers, it's easier to continue leveraging Salesforce's existing platform than to adopt new technologies. Like the healthiest of the band of mustangs, the most skilled and aggressive will thrive and survive.

Salesforce's established presence and broad distribution put it in a strong position at a time when large companies are often hesitant to embrace new tech. Its fearless approach to job displacement suggests the company is poised to profit significantly from its AI venture. As a result, Salesforce may well become a formidable competitor in the AI world. Furthermore, by taking its own investment in AI education to new heights, Salesforce shows it has an eye on people and not just profits. Much like the lead stallion in a wild herd, Salesforce is protecting itself and its biggest asset: its people!

By Tectonic's Salesforce Solutions Architect, Shannan Hearne

Read More
Data Cloud and Genpact

6 Work-Easing Google AI Extensions

Time is often equated with money, and that couldn't be more accurate. These work-easing Google AI extensions will give you back some time.

Time is an invaluable asset, essential for managing both work and life. Many find themselves overwhelmed with tasks, constantly battling stress and exhaustion due to a lack of time. This was the case until AI-powered Chrome extensions came along, revolutionizing the way work is approached. Time-conscious workers have discovered the transformative power of these digital tools. They not only improve careers but also completely change how each person manages their workload. Now Tectonic is sharing these invaluable resources, which have significantly boosted productivity and time management.

Here are some of the AI Chrome extensions we recommend:

1. Glasp AI: a robust tool that aids in capturing, organizing, and sharing knowledge from the web.

2. Merlin AI: uses advanced technology to assist with various tasks, helping users work faster and more efficiently.

3. Just Done AI: a great tool for detecting AI content.

4. Perplexity AI: designed to elevate the browsing experience by providing quick access to AI-driven information and assistance.

5. SciSpace: simplifies the understanding of scientific papers or reports by acting as a virtual assistant within the web browser. Just imagine having a coach explaining the technical concepts.

6. WebChatGPT: enhances ChatGPT by allowing it to access current information from the internet, making the tool even more effective by connecting it to up-to-date data.

These AI Chrome extensions have dramatically improved time management and productivity. They offer similar potential for anyone looking to optimize their workflow and make the most of their work time.

Read More
Large and Small Language Models

Large and Small Language Models

Understanding Language Models in AI

Language models are sophisticated AI systems designed to generate natural human language, a task that is far from simple. These models operate as probabilistic machine learning systems, predicting the likelihood of word sequences to emulate human-like intelligence. In the scientific realm, the focus of language models has been twofold. While today's cutting-edge AI models in Natural Language Processing (NLP) are impressive, they have not yet fully passed the Turing Test, a benchmark where a machine's communication is indistinguishable from that of a human.

The Emergence of Language Models

We are approaching this milestone with advancements in Large Language Models (LLMs) and the promising but less discussed Small Language Models (SLMs).

Large Language Models Compared to Small Language Models

LLMs like ChatGPT have garnered significant attention due to their ability to handle complex interactions and provide insightful responses. These models distill vast amounts of internet data into concise and relevant information, offering an alternative to traditional search methods. Conversely, SLMs, such as Mistral 7B, while less flashy, are valuable for specific applications. They typically contain fewer parameters and focus on specialized domains, providing targeted expertise without the broad capabilities of LLMs.

How LLMs Work

Comparing LLMs and SLMs

Choosing the Right Language Model

The decision between LLMs and SLMs depends on your specific needs and available resources. LLMs are well-suited for broad applications like chatbots and customer support. In contrast, SLMs are ideal for specialized tasks in fields such as medicine, law, and finance, where domain-specific knowledge is crucial.

Large and Small Language Models' Roles

Language models are powerful tools that, depending on their size and focus, can either provide broad capabilities or specialized expertise. Understanding their strengths and limitations helps in selecting the right model for your use case.
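To ground the statement that language models "predict the likelihood of word sequences," here is a toy sketch: a bigram model that estimates next-word probabilities from counted text. Real LLMs and SLMs do the same basic thing over subword tokens with billions of learned parameters rather than a small counting table, so treat this purely as a conceptual illustration.

```python
# Toy illustration of the core idea behind language models: estimate the
# probability of the next word given what came before, then sample from it.
from collections import Counter, defaultdict
import random

corpus = "the model predicts the next word the model samples the next word".split()

# Count word -> next-word frequencies (a one-word context, i.e. a bigram model).
transitions = defaultdict(Counter)
for current, nxt in zip(corpus, corpus[1:]):
    transitions[current][nxt] += 1

def next_word_distribution(word):
    counts = transitions[word]
    total = sum(counts.values())
    return {w: c / total for w, c in counts.items()}

print(next_word_distribution("the"))  # e.g. {'model': 0.5, 'next': 0.5}

def generate(start, length=5):
    word, out = start, [start]
    for _ in range(length):
        dist = next_word_distribution(word)
        word = random.choices(list(dist), weights=list(dist.values()))[0]
        out.append(word)
    return " ".join(out)

print(generate("the"))
```

The "large" versus "small" distinction in the article is essentially about how rich that probability estimator is: more parameters and more training data give broader coverage, at much higher cost.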

Read More
Outlook Offline Access

Outlook Offline Access

The upcoming Outlook app update will introduce offline access, allowing users to open the app without needing an internet connection, a feature previously unavailable. Offline access was limited to instances where the app was already running and lost its connection. Additionally, starting this December, the app will automatically synchronize calendars when transitioning between the classic and new versions of Outlook.

Teams is also receiving several enhancements. In November, Android and iOS users will benefit from a new video feature called Cloud IntelliFrame. This technology improves the visibility of participants during video meetings by optimizing video framing, and it will be available for mobile users joining meetings with Teams Rooms on Windows.

For Teams users on laptops, Microsoft is introducing a feature that simplifies the use of shared meeting room devices. When a user connects their laptop to a Teams meeting room via USB, the tool will automatically detect the room's audio settings. A pre-join screen will then prompt the user to connect, enhancing the BYOD (Bring Your Own Device) experience. This functionality supports various meeting room devices, such as screens and audio equipment, provided they are on a Microsoft-supported list.

Mac users will also see improvements in Teams next month with Microsoft Edge. If Edge is set as the default browser and the feature is activated, web links from the Teams app will automatically open in the same profile used to log into Teams. This streamlines the process by eliminating the need for additional logins, making it quicker to access links from chats, channels, and other areas. Administrators can control this functionality through the "Choose Which Browser Opens Web Links" policy in Microsoft 365.

Additionally, several new features for Microsoft 365, including updates to Microsoft Viva and SharePoint, will be rolling out soon.

Read More
AI and Big Data

AI and Big Data

Over the past decade, enterprises have accumulated vast amounts of data, capturing everything from business processes to inventory statistics. This surge in data marked the onset of the big data revolution. However, merely storing and managing big data is no longer sufficient to extract its full value. As organizations become adept at handling big data, forward-thinking companies are now leveraging advanced analytics and the latest AI and machine learning techniques to unlock even greater insights. These technologies can identify patterns and provide cognitive capabilities across vast datasets, enabling organizations to elevate their data analytics to new levels. Additionally, the adoption of generative AI systems is on the rise, offering more conversational approaches to data analysis and enhancement. This allows organizations to extract significant insights from information that would otherwise remain untapped in data stores.

How Are AI and Big Data Related?

Applying machine learning algorithms to big data is a logical progression for companies aiming to maximize the potential of their data. Unlike traditional rules-based approaches that follow explicit instructions, machine learning systems use data-driven algorithms and statistical models to analyze and detect patterns in data. Big data serves as the raw material for these systems, which derive valuable insights from it. Organizations are increasingly recognizing the benefits of integrating big data with machine learning. However, to fully harness the power of both, it's crucial to understand their individual capabilities.

Understanding Big Data

Big data involves extracting and analyzing information from large quantities of data, but volume is just one aspect. Other critical "Vs" of big data that enterprises must manage include velocity, variety, veracity, validity, visualization, and value.

Understanding Machine Learning

Machine learning, the backbone of modern AI, adds significant value to big data applications by deriving deeper insights. These systems learn and adapt over time without the need for explicit programming, using statistical models to analyze and infer patterns from data. Historically, companies relied on complex, rules-based systems for reporting, which often proved inflexible and unable to cope with constant change. Today, machine learning and deep learning enable systems to learn from big data, enhancing decision-making, business intelligence, and predictive analysis.

The strength of machine learning lies in its ability to discover patterns in data. The more data available, the more these algorithms can identify patterns and apply them to future data. Applications range from recommendation systems and anomaly detection to image recognition and natural language processing (NLP).

Categories of Machine Learning Algorithms

Machine learning algorithms generally fall into three categories: supervised learning, unsupervised learning, and reinforcement learning. The most powerful large language models (LLMs), which underpin today's widely used generative AI systems, utilize a combination of these methods, learning from massive datasets.

Understanding Generative AI

Generative AI models are among the most powerful and popular AI applications, creating new data based on patterns learned from extensive training datasets. These models, which interact with users through conversational interfaces, are trained on vast amounts of internet data, including conversations, interviews, and social media posts.
With pre-trained LLMs, users can generate new text, images, audio, and other outputs using natural language prompts, without the need for coding or specialized models.

How Does AI Benefit Big Data?

AI, combined with big data, is transforming businesses across various sectors. Key benefits include:

Big Data and Machine Learning: A Synergistic Relationship

Big data and machine learning are not competing concepts; when combined, they deliver remarkable results. Emerging big data techniques offer powerful ways to manage and analyze data, while machine learning models extract valuable insights from it. Successfully handling the various "Vs" of big data enhances the accuracy and power of machine learning models, leading to better business outcomes. The volume of data is expected to grow exponentially, with predictions of over 660 zettabytes of data worldwide by 2030. As data continues to amass, machine learning will become increasingly reliant on big data, and companies that fail to leverage this combination will struggle to keep up.

Examples of AI and Big Data in Action

Many organizations are already harnessing the power of machine learning-enhanced big data analytics:

Conclusion

The integration of AI and big data is crucial for organizations seeking to drive digital transformation and gain a competitive edge. As companies continue to combine these technologies, they will unlock new opportunities for personalization, efficiency, and innovation, ensuring they remain at the forefront of their industries.
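As a concrete, hypothetical illustration of that synergy, the short sketch below lets an unsupervised algorithm discover customer segments in a synthetic dataset, the kind of pattern discovery a rules-based report would never define on its own. It requires scikit-learn and NumPy, and the feature names and figures are invented for demonstration.

```python
# Hypothetical sketch: unsupervised machine learning finding structure in
# (simulated) customer data. Features are [annual_spend, orders_per_year,
# support_tickets]; all numbers are synthetic.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(42)
customers = np.vstack([
    rng.normal([200, 2, 1], [50, 1, 1], size=(500, 3)),      # occasional buyers
    rng.normal([1500, 20, 3], [300, 5, 2], size=(500, 3)),   # loyal customers
    rng.normal([900, 8, 15], [200, 3, 4], size=(500, 3)),    # high-support accounts
])

# Scale features so spend (hundreds) doesn't dominate ticket counts (single digits).
X = StandardScaler().fit_transform(customers)
segments = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)

for label in range(3):
    members = customers[segments == label]
    print(f"Segment {label}: {len(members)} customers, "
          f"avg spend ${members[:, 0].mean():,.0f}")
```

The more records the "big data" side supplies, the more stable and useful these discovered segments become, which is the synergy the section above describes.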

Read More
Connected Vehicles

Connected Vehicles

Revolutionizing the Automotive Industry: Salesforce's Connected Car App

The automotive industry has always been a beacon of innovation, consistently pushing the boundaries to enhance the driving experience. From the iconic Model T and the assembly line to today's electric and autonomous vehicles, the evolution of automobiles has been driven by an unyielding pursuit of progress. I actually purchased a new-to-me car today, and with the connected vehicle on the horizon I'm kind of glad I'll be able to upgrade in a couple of years. Bluetooth and backup cameras are great. But a car that can tell the dealership to get me on the horn before some automotive calamity occurs? The future is here, my friends.

Connected Vehicles for Better Experiences

Now, as digital transformation reshapes industries, a new chapter is emerging in automotive innovation: the connected car. Leading this charge is Salesforce, a global powerhouse in customer relationship management (CRM), with the introduction of its groundbreaking Connected Car App, poised to redefine in-car experiences for both drivers and passengers. From my personal buying experience today, the car business could use some customer relationship management!

The Future of In-Car Connectivity

Salesforce's Connected Car App is more than just a technological enhancement; it represents a fundamental shift in how we interact with our vehicles. By leveraging Salesforce's Customer 360 platform, this app creates personalized, engaging experiences that go far beyond traditional automotive features. The Connected Car App is designed to make every journey more intuitive and efficient, offering real-time insights and services tailored to the unique needs of each driver. Whether it's maintenance alerts, optimized route suggestions based on traffic, or personalized entertainment options, the app transforms the car into a truly smart companion on the road. A GPS feature? I guess I can plan on deleting Waze off my phone in the near future!

Powered by Salesforce Customer 360

At the heart of the Connected Car App is Salesforce's Customer 360 platform, which delivers a comprehensive, 360-degree view of each customer. This integration ensures that the app provides tailored experiences based on a deep understanding of the driver's preferences, habits, and history. It isn't going to know you by just a vehicle loan number, a VIN, or an email address. For instance, a driver who frequently takes long road trips might receive customized recommendations for rest stops, dining options, and attractions along their route. Meanwhile, commuters could benefit from real-time updates on traffic, weather, and parking availability. The app's ability to anticipate and respond to the driver's needs in real time distinguishes it from traditional in-car systems. I can just hear my car now, advising me it has been one hour since I stopped for coffee, and she's worried about my sanity.

Enhancing Customer Loyalty and Satisfaction with Connected Vehicles

The Connected Car App offers significant potential to boost customer loyalty and satisfaction. By delivering a personalized driving experience, automakers can strengthen relationships with customers, transforming each driving journey into an opportunity to build brand loyalty. If Toyota is suddenly going to treat me like Shannan Hearne instead of customer #xxxxx, I will be ecstatic. Additionally, the app's capability to collect and analyze data in real time opens new avenues for automakers to engage with their customers.
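As a purely illustrative sketch of the kind of telemetry-driven logic such engagement could build on (not Salesforce's actual Connected Car App or API), consider the following; the event fields, thresholds, and helper names are invented.

```python
# Hypothetical sketch: evaluate incoming vehicle telemetry and create a
# predictive-maintenance follow-up record. Not a real Salesforce integration;
# all field names and thresholds are illustrative only.
from dataclasses import dataclass

@dataclass
class TelemetryEvent:
    vin: str
    odometer_miles: int
    miles_since_oil_change: int
    battery_health_pct: float

def needs_service(event: TelemetryEvent) -> list[str]:
    reasons = []
    if event.miles_since_oil_change > 5000:
        reasons.append("oil change overdue")
    if event.battery_health_pct < 70:
        reasons.append("battery degradation")
    return reasons

def create_service_case(event: TelemetryEvent, reasons: list[str]) -> dict:
    # A real integration would write this record to the CRM and notify the
    # dealership; here we just build the payload.
    return {"vin": event.vin, "type": "Predictive Maintenance", "reasons": reasons}

event = TelemetryEvent(vin="EXAMPLEVIN0000000", odometer_miles=48200,
                       miles_since_oil_change=6100, battery_health_pct=88.0)
if reasons := needs_service(event):
    print(create_service_case(event, reasons))
```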
Predictive maintenance reminders, targeted promotions, and special offers are just a few examples of how the app fosters a deeper connection between the brand and the driver. Oh, yeah. My connected vehicle app is DEFINITELY going to be talking to me about changing my oil (I'm not exactly diligent), how great the latest model of Toyota is (I drove a Corolla for 18 years and have also owned a Tacoma, a Tundra, and a Prius), and if it would add coffee coupons I would be golden.

A New Era of Automotive Innovation

Salesforce's Connected Car App marks a pivotal moment in the automotive industry's digital transformation. As vehicles become increasingly connected, the opportunities for innovation are boundless. Salesforce is at the forefront with a solution that not only enhances the driving experience but also empowers automakers to build stronger, more meaningful relationships with their customers. In a world where customer expectations are constantly growing, the Connected Car App is a game-changer. Customers, even car owners, expect their brands to know them and recognize them. By integrating Salesforce's CRM capabilities directly into vehicles, the app creates a seamless, personalized experience that stands out. As we look ahead, it's clear that the Connected Car App is just the beginning of an exciting new era of automotive innovation. As a marketer at heart and a technologist by trade, I'm really excited about the potential here.

Connected Vehicle: A Unified Digital Foundation

Salesforce's Connected Vehicle platform provides automakers with a unified, intelligent digital foundation, enabling them to reduce development time and roll out features and updates faster than ever before. This platform allows seamless integration of vehicle, Internet of Things (IoT), driver, and retail data from various sources, including AWS IoT FleetWise and the Snapdragon® Car-to-Cloud Connected Services Platform, to enhance driver experiences and ensure smooth vehicle operation. Can you imagine a smart app like Connected Vehicle talking to your loyalty apps for gas stations, convenience stores, and grocery stores? I would be driving down the interstate and the app would tell me there is a Starbucks ahead AND I have a 10% off coupon. Automakers and mobility leaders like Sony Honda Mobility are already exploring the use of Connected Vehicle to deliver better experiences for their customers. The platform's ability to access and integrate data from any source in near real time allows automakers to personalize driver experiences, in-car offers, and safety upgrades.

Why It Matters

By 2030, every new vehicle sold will be connected, and the advanced, tech-driven features they provide will be increasingly important to consumers. A recent Salesforce survey revealed that drivers already consider connected features to be nearly as important as a car's brand. Connected Vehicle accelerates this evolution, enabling automakers to immediately deliver branded, customized experiences tailored to

Read More
Collaborate With AI

Collaborate With AI

Many artists, writers, musicians, and creators are facing fears that AI is taking over their jobs. On the surface, generative AI tools can replicate work in moments that previously took creators hours to produce, often at a fraction of the cost and with similar quality. This shift has led many businesses to adopt AI for content creation, leaving creators worried about their livelihoods. Yet there's another way to view this situation, one that offers hope to creators everywhere.

AI, at its core, is a tool of mimicry. When provided with enough data, it can replicate a style or subject with reasonable accuracy. Most of this data has been scraped from the internet, often without explicit consent, to train AI models on a wide variety of creative outputs. If you're a creator, it's likely that pieces of your work have contributed to the training of these AI models. Your art, words, and ideas have helped shape what these systems now consider "good" in the realms of art, music, and writing.

AI can combine the styles of multiple creators to generate something new, but often these creations fall flat. Why? While image-generating AI can predict pixels, it lacks an understanding of human emotions. It knows what a smile looks like but can't grasp the underlying feelings of joy, nervousness, or flirtation that make a smile truly meaningful. AI can only generate a superficial replica unless the creator uses extensive prompt engineering to convey the context behind that smile. Emotion is uniquely human, and it's what makes our creations resonate with others. A single brushstroke from a human artist can convey emotions that might take thousands of words to replicate through an AI prompt.

We've all heard the saying, "A picture is worth a thousand words." But generating that picture with AI often takes many more words. Input a short prompt, and the AI will enhance it with more words, often leading to results that stray from your original vision. To achieve a specific outcome, you may need hours of prompt engineering, trial, and error, and even then the result might not be quite right. Without a human artist to guide the process, these generated works will often remain unimpressive, no matter how advanced the technology becomes.

That's where you, the creator, come in. By introducing your own inputs, such as images or sketches, and using workflows like those in ComfyUI, you can exert more control over the outputs. AI becomes less of a replacement for the artist and more of a tool or collaborator. It can help speed up the creative process but still relies on the artist's hand to guide it toward a meaningful result.

Artists like Martin Nebelong have embraced this approach, treating AI as just another tool in their creative toolbox. Nebelong uses high levels of control in AI-driven workflows to create works imbued with his personal emotional touch. He shares these workflows on platforms like LinkedIn and Twitter, encouraging other creators to explore how AI can speed up their processes while retaining the unique artistry that only humans can provide. Nebelong's philosophy is clear: "I'm pro-creativity, pro-art, and pro-AI. Our tools change, the scope of what we can do changes. I don't think creative AI tools or models have found their best form yet; they're flawed, raw, and difficult to control.
But I'm excited for when they find that form and can act as an extension of our hands, our brush, and as an amplifier of our artistic intent."

AI can help bring an artist 80% of the way to a finished product, but it's the final 20%, the part where human skill and emotional depth come in, that elevates the piece to something truly remarkable. Think about the notorious issues with AI-generated hands. Often the output features too many fingers or impossible poses, a telltale sign of AI's limitations. An artist is still needed to refine the details, correct mistakes, and bring the creation in line with reality. While using AI may be faster than organizing a full photoshoot or painting from scratch, the artist's role has shifted from full authorship to that of a collaborator, guiding AI toward a polished result.

Nebelong often starts with his own artwork and integrates AI-generated elements, using them to enhance but never fully replace his vision. He might even use AI to generate 3D models, lighting, or animations, but the result is always driven by his creativity. For him, AI is just another step in the creative journey, not a shortcut or replacement for human effort.

However, AI's ability to replicate the styles of famous artists and public figures raises ethical concerns. With platforms like CIVIT.AI making it easy to train models on any style or subject, questions arise about the legality and morality of using someone else's likeness or work without permission. As regulations catch up, we may see a future where AI models trained on specific styles or individuals are licensed, allowing creators to retain control over their works in the same way they license their traditional creations today.

The future may also see businesses licensing AI models trained on actors, artists, or styles, allowing them to produce campaigns without booking the actual talent. This would lower costs while still benefiting creators through licensing fees. Actors and artists could continue to contribute their talents long after they've retired, or even passed on, by licensing their digital likenesses, as seen with CGI performances in movies like Rogue One.

In conclusion, AI is pushing creators to learn new skills and adapt to new tools. While this can feel daunting, it's important to remember that AI is just that: a tool. It doesn't understand emotion, intent, or meaning, and it never will. That's where humans come in. By guiding AI with our creativity and emotional depth, we can produce works that resonate with others on a deeper level. For example, you can tell artificial intelligence what an image should look like but not what emotions the image should evoke. Creators, your job isn't disappearing. It's

Read More
Small Language Models

Small Language Models

Large language models (LLMs) like OpenAI's GPT-4 have gained acclaim for their versatility across various tasks, but they come with significant resource demands. In response, the AI industry is shifting focus towards smaller, task-specific models designed to be more efficient. Microsoft, alongside other tech giants, is investing in these smaller models.

Science often involves breaking complex systems down into their simplest forms to understand their behavior. This reductionist approach is now being applied to AI, with the goal of creating smaller models tailored for specific functions. Sébastien Bubeck, Microsoft's VP of generative AI, highlights this trend: "You have this miraculous object, but what exactly was needed for this miracle to happen; what are the basic ingredients that are necessary?"

In recent years, the proliferation of LLMs like ChatGPT, Gemini, and Claude has been remarkable. However, smaller language models (SLMs) are gaining traction as a more resource-efficient alternative. Despite their smaller size, SLMs promise substantial benefits to businesses.

Microsoft introduced Phi-1 in June last year, a smaller model aimed at aiding Python coding. This was followed by Phi-2 and Phi-3, which, though larger than Phi-1, are still much smaller than leading LLMs. For comparison, Phi-3-medium has 14 billion parameters, while GPT-4 is estimated to have 1.76 trillion parameters, about 125 times more. Microsoft touts the Phi-3 models as "the most capable and cost-effective small language models available."

Microsoft's shift towards SLMs reflects a belief that the dominance of a few large models will give way to a more diverse ecosystem of smaller, specialized models. For instance, an SLM designed specifically for analyzing consumer behavior might be more effective for targeted advertising than a broad, general-purpose model trained on the entire internet.

SLMs excel in their focused training on specific domains. "The whole fine-tuning process … is highly specialized for specific use-cases," explains Silvio Savarese, Chief Scientist at Salesforce, another company advancing SLMs. To illustrate, using a specialized screwdriver for a home repair project is more practical than a multifunction tool that's more expensive and less focused.

This trend towards SLMs reflects a broader shift in the AI industry from hype to practical application. As Brian Yamada of VLM notes, "As we move into the operationalization phase of this AI era, small will be the new big." Smaller, specialized models or combinations of models will address specific needs, saving time and resources. Some voices express concern over the dominance of a few large models, with figures like Jack Dorsey advocating for a diverse marketplace of algorithms. Philippe Krakowski of IPG also worries that relying on the same models might stifle creativity.

SLMs offer the advantage of lower costs, both in development and operation. Microsoft's Bubeck emphasizes that SLMs are "several orders of magnitude cheaper" than larger models. Typically, SLMs operate with around three to four billion parameters, making them feasible for deployment on devices like smartphones. However, smaller models come with trade-offs. Fewer parameters mean reduced capabilities. "You have to find the right balance between the intelligence that you need versus the cost," Bubeck acknowledges.

Salesforce's Savarese views SLMs as a step towards a new form of AI, characterized by "agents" capable of performing specific tasks and executing plans autonomously.
This vision of AI agents goes beyond today's chatbots, which can generate travel itineraries but not take action on your behalf. Salesforce recently introduced a 1 billion-parameter SLM that reportedly outperforms some LLMs on targeted tasks. Salesforce CEO Marc Benioff celebrated this advancement, proclaiming, "On-device agentic AI is here!"
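A quick back-of-the-envelope calculation, using the parameter counts cited above, shows why "on-device" is plausible for SLMs but not for frontier-scale LLMs. The figures are rough approximations of weight memory only and ignore activations, context caches, and other runtime overhead.

```python
# Approximate weight memory is parameters x bytes per parameter.
# These are illustrative, order-of-magnitude figures only.
def weight_memory_gb(parameters: float, bytes_per_param: float) -> float:
    return parameters * bytes_per_param / 1e9

for name, params in [("~3.8B-parameter SLM", 3.8e9), ("~1.76T-parameter LLM", 1.76e12)]:
    fp16 = weight_memory_gb(params, 2)    # 16-bit weights
    int4 = weight_memory_gb(params, 0.5)  # 4-bit quantized weights
    print(f"{name}: ~{fp16:,.1f} GB at fp16, ~{int4:,.1f} GB at 4-bit")

# ~3.8B-parameter SLM: ~7.6 GB at fp16, ~1.9 GB at 4-bit  -> phone-scale
# ~1.76T-parameter LLM: ~3,520.0 GB at fp16, ~880.0 GB at 4-bit -> data-center-scale
```

A few billion parameters quantized to 4 bits fits comfortably in a modern smartphone's memory; a trillion-parameter model does not, which is the practical gap the article's "on-device agentic AI" claim is pointing at.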

Read More
SearchGPT and Knowledge Cutoff

SearchGPT and Knowledge Cutoff

Tackling the Knowledge Cutoff Challenge in Generative AI

In the realm of generative AI, a significant hurdle has been the issue of knowledge cutoff, where a large language model (LLM) only has information up until a specific date. This was an early concern with OpenAI's ChatGPT. For example, the GPT-4o model that currently powers ChatGPT has a knowledge cutoff in October 2023. The older GPT-4 model, on the other hand, had a cutoff in September 2021. Traditional search engines like Google, however, don't face this limitation. Google continuously crawls the internet to keep its index up to date with the latest information. To address the knowledge cutoff issue in LLMs, multiple vendors, including OpenAI, are exploring search capabilities powered by generative AI (GenAI).

Introducing SearchGPT: OpenAI's GenAI Search Engine

SearchGPT is OpenAI's GenAI search engine, first announced on July 26, 2024. It aims to combine the strengths of a traditional search engine with the capabilities of GPT LLMs, eliminating the knowledge cutoff by drawing real-time data from the web. SearchGPT is currently a prototype, available to a limited group of test users, including individuals and publishers. OpenAI has invited publishers to ensure their content is accurately represented in search results. The service is positioned as a temporary offering to test and evaluate its performance. Once this evaluation phase is complete, OpenAI plans to integrate SearchGPT's functionality directly into the ChatGPT interface. As of August 2024, OpenAI has not announced when SearchGPT will be generally available or integrated into the main ChatGPT experience.

Key Features of SearchGPT

SearchGPT offers several features designed to enhance the capabilities of ChatGPT.

OpenAI's Challenge to Google Search

Google has long dominated the search engine landscape, a position that OpenAI aims to challenge with SearchGPT.

Answers, Not Links

Traditional search engines like Google act primarily as indexes, pointing users to other sources of information rather than directly providing answers. Google has introduced AI Overviews (formerly Search Generative Experience or SGE) to offer AI-generated summaries, but it still relies heavily on linking to third-party websites. SearchGPT aims to change this by providing direct answers to user queries, summarizing the source material instead of merely pointing to it.

Contextual Continuity

In contrast to Google's point-in-time search queries, where each query is independent, SearchGPT strives to maintain context across multiple queries, offering a more seamless and coherent search experience.

Search Accuracy

Google Search often depends on keyword matching, which can require users to sift through several pages to find relevant information. SearchGPT aims to combine real-time data with an LLM to deliver more contextually accurate and relevant information.

Ad-Free Experience

SearchGPT offers an ad-free interface, providing a cleaner and more user-friendly experience compared to Google, which includes ads in its search results.
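To illustrate the underlying idea (a simplified sketch, not OpenAI's SearchGPT implementation), the snippet below fetches a current web page and asks a model to answer only from that retrieved text, sidestepping the knowledge cutoff. It assumes the `requests`, `beautifulsoup4`, and `openai` packages are installed; the URL and model name are placeholders.

```python
# Sketch of retrieval-augmented answering: pull live web content, then answer
# strictly from it. URL and model name below are placeholders.
import requests
from bs4 import BeautifulSoup
from openai import OpenAI

def fetch_page_text(url: str, max_chars: int = 4000) -> str:
    html = requests.get(url, timeout=10).text
    return BeautifulSoup(html, "html.parser").get_text(" ", strip=True)[:max_chars]

def answer_with_live_context(question: str, url: str) -> str:
    context = fetch_page_text(url)
    client = OpenAI()  # reads OPENAI_API_KEY from the environment
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model name
        messages=[
            {"role": "system",
             "content": "Answer using only the provided context. "
                        "If the context is insufficient, say so."},
            {"role": "user", "content": f"Context:\n{context}\n\nQuestion: {question}"},
        ],
    )
    return response.choices[0].message.content

print(answer_with_live_context("Who won yesterday's match?", "https://example.com/scores"))
```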
AI-Powered Search Engine Comparison

Here's a comparison of the AI-powered search engines available today:

| Search Engine | Platform Integration | Publisher Collaboration | Ads | Cost |
|---|---|---|---|---|
| SearchGPT (OpenAI) | Standalone prototype | Strong emphasis | Ad-free | Free (prototype stage) |
| Google SGE | Built on Google's infrastructure | SEO practices, content partnerships | Includes ads | Free |
| Microsoft Bing AI/Copilot | Built on Microsoft's infrastructure | SEO practices, content partnerships | Includes ads | Free |
| Perplexity AI | Standalone | Basic source attribution | Ad-free | Free; $20/month for premium |
| You.com | AI assistant with various modes | Basic source attribution | Ad-free | Free; premium tiers available |
| Brave Search | Independent search index | Basic source attribution | Ad-free | Free |

Read More
Box Acquires Alphamoon

Box Acquires Alphamoon

Box Inc. has acquired Alphamoon to enhance its intelligent document processing (IDP) capabilities and its enterprise knowledge management AI platform. With Alphamoon on board, Box improves its IDP offering.

Box Acquires Alphamoon

IDP goes beyond traditional optical character recognition (OCR) by applying AI to scanned paper documents and unstructured PDFs. While AI technologies like natural language processing (NLP), workflow automation, and document structure recognition have been around for some time, Alphamoon introduces generative AI (GenAI) into the mix, providing advanced capabilities. According to Rand Wacker, Vice President of AI Product Strategy at Box, the integration of GenAI helps not only with summarizing and extracting content from documents but also with recognizing document structures and categorizing them. GenAI works alongside existing OCR and NLP tools, making the digital conversion of paper documents more accurate.

Box Acquires Alphamoon – Not an LLM

Although Box hasn't acquired a large language model (LLM) outright, it has gained a toolkit that will enhance its Box AI platform. Box AI already uses retrieval-augmented generation to combine a user's content with external LLMs, ensuring data security while training Box AI to better recognize and categorize documents. Alphamoon's technology will further refine this process, enabling administrators to create tools more efficiently within the Box ecosystem.

"For example, if Alphamoon's OCR misreads or misextracts something, the system can adjust that specific part and feed it back into the LLM," Wacker explained. "This approach is powered by an LLM, but it's specifically trained to understand the documents it encounters, rather than relying on generic content from the internet."

Previewing an upcoming report from Deep Analysis, founder Alan Pelz-Sharpe shared that a survey of 500 enterprises across various industries, including financial services, manufacturing, healthcare, and government, revealed that 53% of enterprise documents still exist on paper. This highlights the need for Box users to have more precise tools to digitize contracts, letters, invoices, faxes, and other paper-based documents. Alphamoon's generative AI-driven IDP solution allows for human oversight to ensure that attributes are correctly imported from the original documents. Pelz-Sharpe noted that IDP is challenging, but AI has made significant advancements, especially in handling imperfections like crumpled paper, coffee stains, and handwriting. He added that this acquisition addresses a critical gap for Box, which previously relied on partners for these capabilities.

Box Buys Alphamoon – Integration

Box plans to integrate Alphamoon's tools into its platform later this year, with deeper integrations expected next year. These will include no-code app-building capabilities related to another acquisition, Crooze, as well as Box Relay's forms and document generation tools.
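As a simplified sketch of the IDP pattern described above, the snippet below turns raw OCR text into structured fields via a language-model prompt. `call_llm` is a hypothetical stand-in rather than Box's or Alphamoon's actual API, and real systems add confidence scoring and human review of low-confidence extractions.

```python
# Simplified intelligent document processing (IDP) step: OCR text in,
# structured fields out. `call_llm` is a placeholder, not a real product API.
import json

def build_extraction_prompt(ocr_text: str) -> str:
    return (
        "Extract the following fields from this invoice and reply as JSON with "
        "keys vendor, invoice_number, date, total:\n\n" + ocr_text
    )

def call_llm(prompt: str) -> str:
    # Placeholder: a real system would call its language model here.
    return ('{"vendor": "Acme Corp", "invoice_number": "INV-1042", '
            '"date": "2024-08-02", "total": "$1,250.00"}')

ocr_text = "ACME CORP  Invoice INV-1042  Date: Aug 2, 2024 ... TOTAL DUE $1,250.00"
fields = json.loads(call_llm(build_extraction_prompt(ocr_text)))
print(fields["invoice_number"], fields["total"])
```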

Read More
ChatGPT Word Choices

ChatGPT Word Choices

Why Does ChatGPT Use the Word "Delve" So Much? Mystery Solved.

The mystery behind ChatGPT's frequent use of the word "delve" (one of the 10 most common words it uses) has finally been unraveled, and the answer is quite unexpected. Why are ChatGPT's word choices so repetitive? While "delve" and other words like "tapestry" aren't common in everyday conversations, ChatGPT seems to favor them. You may have noticed this tendency in its outputs. The sudden rise in the use of "delve" in medical papers from March 2024 coincides with the first full year of ChatGPT's widespread use. "Delve," along with phrases like "as an AI language model…," has become a hallmark of ChatGPT's language, almost a giveaway that a text is AI-generated.

But why does ChatGPT overuse "delve"? If it's trained on human data, how did it develop this preference? Is it emergent behavior? And why "delve" specifically? A Guardian article, "How Cheap, Outsourced Labour in Africa is Shaping AI English," provides a clue. The key lies in how ChatGPT was built.

Why "Delve" So Much?

The overuse of "delve" suggests ChatGPT's language might have been influenced after its initial training on internet data. After training on a massive corpus of data, an additional supervised learning step is used to align the AI's behavior. Human annotators evaluate the AI's outputs, and their feedback fine-tunes the model. In broad strokes, the model is first pre-trained on internet text, then fine-tuned on example responses written by human annotators, and finally adjusted through reinforcement learning from human feedback (RLHF), in which annotators rank candidate outputs and the model is optimized toward the preferred ones. This iterative process involves human feedback to improve the AI's responses, ensuring it stays aligned and useful.

However, this feedback is often provided by a workforce in the global south, where English-speaking annotators are more affordable. In Nigeria, "delve" is more commonly used in business English than in the US or UK. Annotators from these regions provided examples using their familiar language, influencing the AI to adopt a slightly African English style. This is an example of poor sampling, where the evaluators' language differs from that of the target users, introducing a bias in the writing style. This bias likely stems from the RLHF step rather than the initial training. ChatGPT's writing style, with or without "delve," is already somewhat robotic and easy to detect. Understanding these potential pitfalls helps us avoid similar issues in future AI development.

Making ChatGPT More Human-Like

To make ChatGPT sound more human and avoid overused words like "delve," a few prompt engineering approaches can help, though they can be time-consuming. Ideally, a quick, reliable tool, like a Chrome extension, would streamline this process. If you've found a solution or a reliable tool for this issue, share it below in the comments. This is a widespread challenge that many users face.
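In the spirit of that last point, here is a small, self-contained utility that flags words disproportionately common in AI-generated prose. The word list is an illustrative sample, not a definitive detector.

```python
# Flag "AI-tell" words in a draft so a human editor can swap in plainer language.
import re
from collections import Counter

AI_TELL_WORDS = {"delve", "tapestry", "landscape", "leverage", "seamless"}

def flag_ai_tells(text: str) -> Counter:
    words = re.findall(r"[a-z]+", text.lower())
    return Counter(w for w in words if w in AI_TELL_WORDS)

draft = ("In this article we delve into the rich tapestry of the AI landscape "
         "and delve into how teams leverage seamless workflows.")
for word, count in flag_ai_tells(draft).most_common():
    print(f"'{word}' appears {count}x - consider a plainer alternative")
```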

Read More
State of AI

State of AI

With the Dreamforce conference just a few weeks away, AI is set to be a central theme once again. This week, Salesforce offered a preview of what to expect in September with the release of its "Trends in AI for CRM" report. This report consolidates findings from several Salesforce research studies conducted from February last year to April this year. The report's executive summary highlights four key insights.

The Fear of Missing Out (FOMO)

An intriguing statistic from Salesforce's "State of Data and Analytics" report reveals that 77% of business leaders feel a fear of missing out on generative AI. This concern is particularly pronounced among marketers (88%), followed by sales executives (78%) and customer service professionals (73%). Given the continued hype around generative AI, these numbers are likely still relevant or even higher as of July 2024.

As Salesforce AI CEO Clara Shih puts it: "The majority of business executives fear they're missing out on AI's benefits, and it's a well-founded concern. Today's technology world is reminiscent of 1998 for the Internet—full of opportunities but also hype." Shih adds: "How do we separate the signal from the noise and identify high-impact enterprise use cases?"

The Quest for ROI and Value

The surge of hype around generative AI over the past 18 months has led to high expectations. While Salesforce has been more responsible in managing user expectations, many executives view generative AI as a cure-all. However, this perspective can be problematic, as "silver bullets" often miss their mark. Recent tech sector developments reflect a shift toward a longer-term view of AI's impact. Meta's share price fell when Mark Zuckerberg emphasized AI as a multi-year project, and Alphabet's Sundar Pichai faced tough questions from Wall Street about the need for continued investment.

Shih notes a growing impatience with the time required to realize AI's value: "It's been over 18 months since ChatGPT sparked excitement about AI in business. Many companies are still grappling with building or buying solutions that are not overly siloed and can be customized. The challenge is finding a balance between quick implementation and configurability." She adds: "The initial belief was that companies could just integrate ChatGPT and see instant transformation. However, there are security risks and practical challenges. For LLMs to be effective, they need contextual data about users and customers."

Conclusion: A Return to the Future

Shih likens the current AI landscape to the late 90s Internet boom, noting: "It's similar to the late 90s when people questioned if the Internet was overhyped. While some investments will not pan out, the transformative potential of successful use cases is enormous. Just as with the Internet, discovering the truly valuable applications of AI may require experimentation and time. We are very much in the 1998 moment for AI now."

Read More
Role of Small Language Models

Role of Small Language Models

The Role of Small Language Models (SLMs) in AI

While much attention is often given to the capabilities of Large Language Models (LLMs), Small Language Models (SLMs) play a vital role in the AI landscape.

Large vs. Small Language Models

LLMs, like GPT-4, excel at managing complex tasks and providing sophisticated responses. However, their substantial computational and energy requirements can make them impractical for smaller organizations and devices with limited processing power. In contrast, SLMs offer a more feasible solution. Designed to be lightweight and resource-efficient, SLMs are ideal for applications operating in constrained computational environments. Their reduced resource demands make them easier and quicker to deploy, while also simplifying maintenance.

What Are Small Language Models?

Small Language Models (SLMs) are neural networks engineered to generate natural language text. The term "small" refers not only to the model's physical size but also to its parameter count, neural architecture, and the volume of data used during training. Parameters are numeric values that guide a model's interpretation of inputs and output generation. Models with fewer parameters are inherently simpler, requiring less training data and computational power. Generally, models with fewer than 100 million parameters are classified as small, though some experts consider models with as few as 1 million to 10 million parameters to be small in comparison to today's large models, which can have hundreds of billions of parameters.

How Small Language Models Work

SLMs achieve efficiency and effectiveness with a reduced parameter count, typically ranging from tens to hundreds of millions, as opposed to the billions seen in larger models. This design choice enhances computational efficiency and task-specific performance while maintaining strong language comprehension and generation capabilities. Techniques such as model compression, knowledge distillation, and transfer learning are critical for optimizing SLMs; a minimal distillation sketch appears at the end of this article. These methods enable SLMs to encapsulate the broad understanding capabilities of larger models into a more concentrated, domain-specific toolset, facilitating precise and effective applications while preserving high performance.

Advantages of Small Language Models

Applications of Small Language Models

The list of roles for small language models is lengthy. SLMs have seen increased adoption due to their ability to produce contextually coherent responses across various applications.

Small Language Models vs. Large Language Models

| Feature | LLMs | SLMs |
|---|---|---|
| Training Dataset | Broad, diverse internet data | Focused, domain-specific data |
| Parameter Count | Billions | Tens to hundreds of millions |
| Computational Demand | High | Low |
| Cost | Expensive | Cost-effective |
| Customization | Limited, general-purpose | High, tailored to specific needs |
| Latency | Higher | Lower |
| Security | Risk of data exposure through APIs | Lower risk, often not open source |
| Maintenance | Complex | Easier |
| Deployment | Requires substantial infrastructure | Suitable for limited hardware environments |
| Application | Broad, including complex tasks | Specific, domain-focused tasks |
| Accuracy in Specific Domains | Potentially less accurate due to general training | High accuracy with domain-specific training |
| Real-time Application | Less ideal due to latency | Ideal due to low latency |
| Bias and Errors | Higher risk of biases and factual errors | Reduced risk due to focused training |
| Development Cycles | Slower | Faster |

Conclusion

The role of Small Language Models (SLMs) is increasingly significant as they offer a practical and efficient alternative to larger models. By focusing on specific needs and operating within constrained environments, SLMs provide targeted precision, cost savings, improved security, and quick responsiveness. As industries continue to integrate AI solutions, the tailored capabilities of SLMs are set to drive innovation and efficiency across various domains.
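To make one of the techniques mentioned above concrete, here is a minimal PyTorch sketch of knowledge distillation: a smaller "student" is trained to match a larger "teacher's" softened output distribution alongside the true labels. The models and data are dummy stand-ins, and real distillation runs this loss inside a full training loop over a dataset.

```python
# Minimal knowledge-distillation loss: soft targets from a teacher plus
# hard targets from ground-truth labels. Requires PyTorch; data is synthetic.
import torch
import torch.nn.functional as F

torch.manual_seed(0)
batch, num_classes, temperature, alpha = 8, 5, 2.0, 0.5

teacher_logits = torch.randn(batch, num_classes)          # from a frozen large model
student_logits = torch.randn(batch, num_classes, requires_grad=True)
labels = torch.randint(0, num_classes, (batch,))

# Soft-target loss: student matches the teacher's temperature-softened distribution.
soft_loss = F.kl_div(
    F.log_softmax(student_logits / temperature, dim=-1),
    F.softmax(teacher_logits / temperature, dim=-1),
    reduction="batchmean",
) * temperature ** 2

# Hard-target loss: ordinary cross-entropy against the ground-truth labels.
hard_loss = F.cross_entropy(student_logits, labels)

loss = alpha * soft_loss + (1 - alpha) * hard_loss
loss.backward()  # gradients flow only into the (smaller) student
print(f"distillation loss: {loss.item():.4f}")
```

Mixing the soft and hard losses (controlled here by `alpha`) is what lets the smaller model inherit the larger model's behavior while staying cheap enough for the constrained deployments the article describes.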

Read More