Alphabet Archives - gettectonic.com
Where LLMs Fall Short

LLM Economies

Throughout history, disruptive technologies have been the catalyst for major social and economic revolutions. The invention of the plow and irrigation systems some 12,000 years ago sparked the Agricultural Revolution, while Johannes Gutenberg's 15th-century printing press fueled the Protestant Reformation and helped propel Europe out of the Middle Ages into the Renaissance. In the 18th century, James Watt's steam engine ushered in the Industrial Revolution. More recently, the internet has revolutionized communication, commerce, and information access, shrinking the world into a global village, and smartphones have transformed how people interact with their surroundings. Now we stand at the dawn of the AI revolution.

Large Language Models (LLMs) represent a monumental leap forward, with significant economic implications at both the macro and micro levels. These models are reshaping global markets, driving new forms of currency, and creating a novel economic landscape. The reason LLMs are transforming industries and redefining economies is simple: they automate both routine and complex tasks that traditionally required human intelligence. They improve decision-making, boost productivity, and cut costs across sectors, freeing organizations to redirect human talent toward more creative and strategic work and toward the development of new products and services. From healthcare to finance to customer service, LLMs are creating new markets and pushing AI-driven services such as content generation and conversational assistants into the mainstream.

To truly grasp the engine driving this new global economy, it is essential to understand the inner workings of this disruptive technology. These posts provide both a macro-level overview of the economic forces at play and a deep dive into the technical mechanics of LLMs, equipping you with a comprehensive understanding of the revolution happening now.

Why Now? The Connection Between Language and Human Intelligence

AI did not begin with ChatGPT's arrival in November 2022. Practitioners were already building machine learning classification models in 1999, and the roots of AI go back even further. Artificial intelligence was formally born in 1950, when Alan Turing, considered the father of theoretical computer science and famed for cracking the Nazi Enigma code during World War II, created the first formal definition of machine intelligence. This definition, known as the Turing Test, demonstrated the potential for machines to exhibit human-like intelligence through natural language conversation: a human evaluator converses with both a human and a machine, and if the evaluator cannot reliably tell the two apart, the machine is considered to have passed. Remarkably, after 72 years of gradual AI development, ChatGPT simulated this very interaction, passing the Turing Test and igniting the current AI explosion.

But why is language so closely tied to human intelligence, rather than, say, vision? Even though a very large share of the brain's processing is devoted to vision, OpenAI's pioneering image generation model, DALL-E, did not trigger the same level of excitement as ChatGPT. The answer lies in the profound role language has played in human evolution.

The Evolution of Language

The development of language was the turning point in humanity's rise to dominance on Earth.
As Yuval Noah Harari points out in his book Sapiens: A Brief History of Humankind, it was the ability to gossip and discuss abstract concepts that set humans apart from other species. Complex communication, such as gossip, requires a shared, sophisticated language. Human language evolved from primitive cave signs to structured alphabets, which, combined with grammar rules, produced languages capable of expressing thousands of words. In today's digital age, language has evolved further with the inclusion of emojis, and now, with the advent of GenAI, tokens have become the latest cornerstone in this progression. These shifts highlight the extraordinary journey of human language, from simple symbols to intricate digital representations. In the next post, we will explore the intricacies of LLMs, focusing specifically on tokens. But before that, let's look at the economic forces shaping the LLM-driven world.

The Forces Shaping the LLM Economy

AI Giants in Competition

Karl Marx and Friedrich Engels argued that those who control the means of production hold power. Today's tech giants understand that AI is the future means of production, and the race to dominate the LLM market is well underway. The competition is fierce, with industry leaders like OpenAI, Google, Microsoft, and Facebook battling for supremacy, and challengers such as Mistral (France), AI21 (Israel), Elon Musk's xAI, and Anthropic entering the fray. The LLM industry is expanding rapidly, with billions of dollars of investment pouring in. Anthropic alone, for example, has raised $4.5 billion from 43 investors, including major players like Amazon and Google.

The Scarcity of GPUs

Just as Bitcoin mining requires vast computational resources, training LLMs demands immense computing power, driving a search for new energy sources; Microsoft's recent investment in nuclear energy underscores this urgency. At the heart of LLM technology are Graphics Processing Units (GPUs), essential for training and running deep neural networks. These GPUs have become scarce and expensive, adding to the competitive tension.

Tokens: The New Currency of the LLM Economy

Tokens are the currency driving the emerging AI economy. Just as money facilitates transactions in traditional markets, tokens are the foundation of LLM economics. But what exactly are tokens? Tokens are the basic units of text that LLMs process. They can be single characters, parts of words, or entire words; the word "Oscar," for example, might be split into two tokens, "os" and "car." The performance of LLMs, in terms of quality, speed, and cost, hinges on how efficiently they generate these tokens. LLM providers price their services by token usage, with different rates for input (prompt) and output (completion) tokens, so as companies rely more heavily on LLMs, especially for complex tasks like agentic applications, token usage will significantly affect operational costs (a short cost-estimation sketch follows at the end of this excerpt). With fierce competition and the rise of open-source models like Llama-3.1, the cost of tokens is falling rapidly; OpenAI, for instance, has reduced its GPT-4 pricing by about 80% over the past year and a half. This trend enables companies to expand their portfolios of AI-powered products, further fueling the LLM economy.

Context Windows: Expanding Capabilities
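To make the token-pricing mechanics described above concrete, here is a minimal Python sketch of how the cost of a single LLM call might be estimated from prompt and completion token counts. The per-1K-token prices and the token counts are illustrative placeholders, not actual vendor rates; real providers publish their own price lists and tokenizers.

```python
# Minimal sketch: estimating the cost of one LLM call from token counts.
# The per-1K-token prices below are illustrative placeholders, not real vendor rates.

def estimate_call_cost(prompt_tokens: int,
                       completion_tokens: int,
                       input_price_per_1k: float = 0.005,   # hypothetical $ per 1K prompt tokens
                       output_price_per_1k: float = 0.015   # hypothetical $ per 1K completion tokens
                       ) -> float:
    """Prompt (input) and completion (output) tokens are billed at different rates."""
    return (prompt_tokens / 1000) * input_price_per_1k + \
           (completion_tokens / 1000) * output_price_per_1k

if __name__ == "__main__":
    # Example: one call that sends 1,200 prompt tokens and receives 400 completion tokens.
    per_call = estimate_call_cost(prompt_tokens=1200, completion_tokens=400)
    print(f"Estimated cost per call: ${per_call:.4f}")
    # Agentic applications chain many such calls, so per-token prices compound quickly.
    print(f"Estimated cost of 1,000,000 such calls: ${per_call * 1_000_000:,.0f}")
```

Because agentic applications chain many such calls, small differences in per-token pricing compound quickly, which is one reason falling token prices matter so much for the economics described above.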

Read More
State of AI

With the Dreamforce conference just a few weeks away, AI is set to be a central theme once again. This week, Salesforce offered a preview of what to expect in September with the release of its "Trends in AI for CRM" report, which consolidates findings from several Salesforce research studies conducted from February of last year to April of this year. The report's executive summary highlights four key insights.

The Fear of Missing Out (FOMO)

An intriguing statistic from Salesforce's "State of Data and Analytics" report reveals that 77% of business leaders fear missing out on generative AI. The concern is most pronounced among marketers (88%), followed by sales executives (78%) and customer service professionals (73%). Given the continued hype around generative AI, these numbers are likely still relevant, or even higher, as of July 2024. As Salesforce AI CEO Clara Shih puts it: "The majority of business executives fear they're missing out on AI's benefits, and it's a well-founded concern. Today's technology world is reminiscent of 1998 for the Internet: full of opportunities but also hype." Shih adds: "How do we separate the signal from the noise and identify high-impact enterprise use cases?"

The Quest for ROI and Value

The surge of hype around generative AI over the past 18 months has raised expectations. While Salesforce has been comparatively responsible in managing user expectations, many executives view generative AI as a cure-all, a perspective that can be problematic because "silver bullets" often miss their mark. Recent tech sector developments reflect a shift toward a longer-term view of AI's impact: Meta's share price fell when Mark Zuckerberg framed AI as a multi-year project, and Alphabet's Sundar Pichai faced tough questions from Wall Street about the need for continued investment.

Shih notes a growing impatience with the time required to realize AI's value: "It's been over 18 months since ChatGPT sparked excitement about AI in business. Many companies are still grappling with building or buying solutions that are not overly siloed and can be customized. The challenge is finding a balance between quick implementation and configurability." She adds: "The initial belief was that companies could just integrate ChatGPT and see instant transformation. However, there are security risks and practical challenges. For LLMs to be effective, they need contextual data about users and customers."

Conclusion: A Return to the Future

Shih likens the current AI landscape to the late-90s Internet boom: "It's similar to the late 90s when people questioned if the Internet was overhyped. While some investments will not pan out, the transformative potential of successful use cases is enormous. Just as with the Internet, discovering the truly valuable applications of AI may require experimentation and time. We are very much in the 1998 moment for AI now."

Read More
Alphabet Abandons Acquisition for HubSpot

Google parent Alphabet has abandoned its plans to acquire HubSpot, according to sources familiar with the matter, putting to rest what would have been one of the year's largest takeovers. Had a deal gone through, the integration between Salesforce and HubSpot could have changed drastically. Today, the HubSpot-Salesforce integration lets you pass data between the two platforms seamlessly and keep your marketing and sales teams consistent, while the current HubSpot-Google integration lets you log emails sent from Gmail into HubSpot CRM with one click and connects to your Google Calendar to help you book more meetings in less time.

HubSpot, Inc. is an American developer and marketer of software products for inbound marketing, sales, and customer service, founded by Brian Halligan and Dharmesh Shah in 2006. Earlier this year, Alphabet had expressed interest in a potential deal, but the talks never progressed to detailed discussions or due diligence and fell apart shortly after the companies held initial conversations, said the sources, who requested anonymity to discuss confidential matters. Representatives for Alphabet did not immediately comment, and a HubSpot spokesperson also declined to comment.

Shares of HubSpot, a customer relationship management company, plummeted by as much as 19 percent on Wednesday (Jul 10) in New York trading, the most significant drop since 2020. The shares closed down 12 percent at $492.31, giving the company a market value of approximately $25 billion. Any acquisition of HubSpot would have been among the largest tech deals of the year, comparable to Synopsys's pending $34 billion acquisition of Ansys, according to data compiled by Bloomberg.

HubSpot, which recently suffered a hacking attack, builds marketing software for small and medium-sized businesses and has specialized in so-called inbound marketing, in which consumers initiate engagement with a brand; HubSpot customers use its software to create advertising content that consumers can click on. CEO Yamini Rangan said in May on HubSpot's financial results call that customer demand had weakened as small businesses worried about the economic impact of high interest rates. Acquiring Cambridge, Massachusetts-based HubSpot, which caters to small and midsize enterprises, would have bolstered Alphabet's competitiveness against rivals like Microsoft, Oracle, and Salesforce.

Read More

Alphabet Soup of Cloud Terminology

As with any technology, the cloud brings its own alphabet soup of terms. This insight will hopefully help you navigate the terminology and give you the knowledge to make the decisions you need to make when considering a new cloud implementation. The terms covered here include SaaS, PaaS, IaaS, and DaaS. Let's dig into the definitions and examples to help drive home their meanings.

SaaS (Software as a Service)

This is probably the most common implementation of cloud services that end users experience: software that users access through their web browser. Some software may be installed locally to augment functionality or provide a richer user experience, but it has minimal impact on the user's computer. Figure 1 provides a high-level overview of this concept.

Figure 1: High-level overview of Software as a Service

You are probably a user of Facebook, Google Docs, Office 365, Salesforce, or LinkedIn, either at home or at work, so you have experienced SaaS firsthand, and probably for a long time. What SaaS tools are you using outside of those mentioned here? Reach out and let me know; I'm very curious.

PaaS (Platform as a Service)

PaaS allows a developer to deploy code to an environment that supports their software without full access to the operating system; the developer has no server responsibility or server access. When I first started writing about cloud technology three years ago, this was a fairly primitive kind of service: the provider would give you access to a folder somewhere on a server with a bit of documentation, and then you were on your own. Now there are tools, such as CloudFoundry, that allow a developer to deploy straight from their Integrated Development Environment (IDE) or from a command-line release tool, and CloudFoundry then takes the transmitted release and installs it correctly into the cloud environment. With a little trial and error, anyone with basic technical skills can deploy to a tool like CloudFoundry, whereas the older style of PaaS took a lot of skill and experience to deploy correctly.

IaaS (Infrastructure as a Service)

Originally, IaaS meant a provider giving a user access to a virtual machine located on a system in the provider's data center. A virtual machine is an operating system that resides in a piece of software on the host computer; VirtualBox, Parallels, and VMware are examples of software that provide this kind of virtualization of operating systems into virtual machines (VMs). Virtualization of servers was all the rage for a while, but when you try to scale in the cloud with multiple virtual servers there are significant drawbacks. First, it is a lot of work to make VMs aware of each other, and they do not always share filesystems and resources easily. As your needs grow, VMs with a lot of memory and disk space become very expensive, and an application on a VM often uses only a portion of the OS; if you deploy a tool that does data aggregation and runs as a service, for example, you will not be taking advantage of the web server that might also be running on that server.

These headaches are common for teams moving their on-premise implementations to the cloud, and they gave rise to Docker. Docker is a lighter-weight form of virtualization that allows for easier sharing of files, versioning, and configuration.
Servers that could only host a few VMs can host thousands of Docker containers, so providers get better bang for the buck from their server purchases. A full explanation of Docker is an article all by itself, but for now it is important to realize that Docker needs to be part of any discussion of moving your applications to the cloud (a brief sketch using Docker's Python SDK appears at the end of this excerpt).

DaaS (Desktop as a Service)

Desktop computers are expensive for large corporations to implement and maintain. The cost of the OS, hardware, security software, productivity software, and more adds up to the point where it makes a major impact on any corporation's budget. Then, just as they finish deploying new systems to everyone in the company, it is time to start upgrading again because Microsoft just released a new OS. Most desktop computers are also heavily underutilized, and DaaS allows an IT department to dynamically allocate RAM and disk space based on user need. In addition, backups and restores are a breeze in this environment, and if you are using a third-party provider, all you need to do is make a phone call when a restore of a file or desktop is needed. Upgrades to new operating systems are seamless because the DaaS provider takes care of them for you.

The main advantage I see with DaaS is security. On one project I was involved with, we restored the state of each desktop to a base configuration each night. While this did not affect user files, it did remove any malware that might have been accidentally installed by a user clicking on the wrong email. Documents from Microsoft Office or Adobe products were scanned with a separate antivirus program residing on the storage system they were part of, and the network appliance that we used did not allow for the execution of software, which made it very secure for the client I was working with. So what does a user have on their desk, then? Luckily, in recent years there has been an explosion of low-cost computing devices, such as the Raspberry Pi, that support Remote Desktop Protocol (RDP), so your users can access a Windows desktop from a Linux-based Pi that costs very little. DaaS is great for your average information worker, but for a power user like a software developer, this setup in my experience does not work well. Your average developer needs
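To make the Docker discussion above slightly more concrete, here is a hedged sketch using the Docker SDK for Python (the docker package). It assumes Docker is installed and running locally and that the SDK has been installed with pip; the image name and command are arbitrary examples rather than anything specific to this post.

```python
# Minimal sketch: running a throwaway container instead of provisioning a full VM.
# Assumes Docker is running locally and the Docker SDK for Python is installed:
#   pip install docker
import docker

client = docker.from_env()  # connect to the local Docker daemon

# Start a small container from a public image; image and command are arbitrary examples.
output = client.containers.run(
    image="python:3.12-slim",
    command=["python", "-c", "print('hello from a container')"],
    remove=True,  # delete the container once the command exits
)
print(output.decode().strip())
```

Compared with booting a virtual machine, the container starts in seconds and shares the host kernel rather than running its own full OS, which is why a server that could host only a few VMs can run a great many containers.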

Read More