Data Center Archives - gettectonic.com

On Premise Gen AI

In 2025, enterprises transitioning generative AI (GenAI) into production after years of experimentation are increasingly considering on-premises deployment as a cost-effective alternative to the cloud.

Since OpenAI ignited the AI revolution in late 2022, organizations have tested the large language models powering GenAI services on platforms like AWS, Microsoft Azure, and Google Cloud. These experiments demonstrated GenAI's potential to enhance business operations while exposing the substantial costs of cloud usage. To avoid difficult conversations with CFOs about escalating cloud expenses, CIOs are exploring on-premises AI as a financially viable alternative. Advances in software from startups and packaged infrastructure from vendors such as HPE and Dell are making private data centers an attractive option for managing costs.

A survey conducted by Menlo Ventures in late 2024 found that 47% of U.S. enterprises with at least 50 employees were developing GenAI solutions in-house. Similarly, Informa TechTarget's Enterprise Strategy Group reported a rise in enterprises weighing on-premises and public cloud equally for new applications, from 37% in 2024 to 45% in 2025.

This shift is reflected in hardware sales. HPE reported a 16% revenue increase in AI systems, reaching $1.5 billion in Q4 2024. During the same period, Dell recorded a record $3.6 billion in AI server orders, with its sales pipeline expanding by over 50% across various customer segments. "Customers are seeking diverse AI-capable server solutions," noted David Schmidt, senior director of Dell's PowerEdge server line.

While heavily regulated industries have traditionally relied on on-premises systems to ensure data privacy and security, broader adoption is now driven by the need for cost control. Fortune 2000 companies are leading this trend, opting for private infrastructure over the cloud for its more predictable expenses.
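The cost-predictability argument above ultimately comes down to simple break-even arithmetic. The sketch below is illustrative only; every figure is an assumed placeholder, not vendor pricing, and a real comparison must also account for power, staffing, and hardware refresh cycles.

```python
# Illustrative break-even sketch for the cloud-vs-on-prem cost decision.
# All figures are hypothetical placeholders, not vendor pricing.

def months_to_break_even(cloud_monthly: float,
                         onprem_capex: float,
                         onprem_monthly_opex: float) -> float:
    """Months until cumulative cloud spend exceeds on-prem capex + opex."""
    saving_per_month = cloud_monthly - onprem_monthly_opex
    if saving_per_month <= 0:
        return float("inf")  # on-prem never pays for itself
    return onprem_capex / saving_per_month

# Hypothetical: a $400k/month cloud bill vs. $6M of GPU servers
# plus $150k/month to run them in a colocation facility.
months = months_to_break_even(400_000, 6_000_000, 150_000)
print(round(months, 1))  # 24.0 -> roughly a two-year payback
```

Under these assumed numbers the private infrastructure pays for itself in about two years, which is why CFO conversations increasingly start from this kind of model.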
Cloud bills in the hundreds of thousands of dollars, or even millions, per month are not unusual, according to John Annand, an analyst at Info-Tech Research Group.

Global manufacturing giant Jabil primarily uses AWS for GenAI development but emphasizes ongoing cost management. "Does moving to the cloud provide a cost advantage? Sometimes it doesn't," said CIO May Yap. Jabil employs a continuous cloud financial optimization process to maximize efficiency.

On-Premises AI: Technology and Trends

Enterprises now have alternatives to cloud infrastructure, including as-a-service offerings like Dell APEX and HPE GreenLake, which provide flexible pay-per-use pricing for AI servers, storage, and networking tailored to private data centers or colocation facilities. "The high cost of cloud drives organizations to seek more predictable expenses," said Tiffany Osias, vice president of global colocation services at Equinix.

Walmart exemplifies in-house AI development, creating tools like a document summarization app for its benefits help desk and an AI assistant for corporate employees. Startups are also enabling enterprises to build AI applications with turnkey solutions. "About 80% of GenAI requirements can now be addressed with push-button solutions from startups," said Tim Tully, partner at Menlo Ventures. Companies like Ragie (RAG-as-a-service) and Lamatic.ai (GenAI platform-as-a-service) are driving this innovation. Others, like Squid AI, integrate custom AI agents with existing enterprise infrastructure.

Open-source frameworks like LangChain further empower on-premises development, offering tools for creating chatbots, virtual assistants, and intelligent search systems. Its extension, LangGraph, adds functionality for building multi-agent workflows.

As enterprises develop AI applications internally, consulting services will play a pivotal role. "Companies offering guidance on effective AI tool usage and aligning them with business outcomes will thrive," Annand said.
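The retrieval step behind the RAG-as-a-service offerings mentioned above can be caricatured in a few lines. Real systems use learned embeddings and a vector store; this sketch substitutes bag-of-words cosine similarity so the mechanism is visible, and the sample documents are invented for illustration.

```python
# Minimal sketch of retrieval-augmented generation's retrieval step.
# Bag-of-words cosine similarity stands in for learned embeddings.
from collections import Counter
from math import sqrt

DOCS = [
    "GreenLake offers pay-per-use pricing for on-premises AI infrastructure.",
    "LangGraph adds multi-agent workflow support on top of LangChain.",
    "Colocation facilities give enterprises predictable hosting expenses.",
]

def vectorize(text: str) -> Counter:
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    norm = sqrt(sum(v * v for v in a.values())) * sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def retrieve(query: str, docs: list[str]) -> str:
    """Return the document most similar to the query."""
    return max(docs, key=lambda d: cosine(vectorize(query), vectorize(d)))

# The retrieved passage is then stuffed into the model prompt as context.
context = retrieve("multi-agent workflows", DOCS)
prompt = f"Answer using this context:\n{context}\n\nQuestion: multi-agent workflows?"
```

A production pipeline swaps in an embedding model and a vector database, but the retrieve-then-prompt shape is the same.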
This evolution in AI deployment highlights the growing importance of balancing technological innovation with financial sustainability.


AI Energy Solution

Could the AI Energy Solution Make AI Unstoppable?

The Rise of Brain-Based AI

In 2002, Jason Padgett, a furniture salesman from Tacoma, Washington, experienced a life-altering transformation after a traumatic brain injury. Following a violent assault, Padgett began to perceive the world through intricate patterns of geometry and fractals, developing a profound, intuitive grasp of advanced mathematical concepts despite no formal education in the subject. His extraordinary abilities, emerging from the brain's adaptation to injury, revealed an essential truth: the human brain has a remarkable capacity for resilience and reorganization.

This phenomenon underscores the brain's reliance on inhibition, a critical mechanism that silences or separates neural processes to conserve energy, clarify signals, and enable complex cognition. Researcher Iain McGilchrist highlights that this ability to step back from immediate stimuli fosters reflection and thoughtful action. Yet this foundational trait, key to the brain's efficiency and adaptability, is absent from today's dominant AI models.

Current AI systems, like the Transformers powering tools such as ChatGPT, lack inhibition. These models rely on probabilistic predictions derived from massive datasets, resulting in inefficiencies and an inability to learn independently. The rise of brain-based AI seeks to emulate aspects of inhibition, creating systems that are not only more energy-efficient but also capable of learning from real-world, primary data without constant retraining.

The AI Energy Problem

Today's AI landscape is dominated by Transformer models, known for their ability to process vast amounts of secondary data, such as scraped text, images, and videos. While these models have propelled significant advancements, their insatiable demand for computational power has exposed critical flaws.
As energy costs rise and infrastructure investment balloons, the industry is beginning to reevaluate its reliance on Transformer models. This shift has sparked interest in brain-inspired AI, which promises sustainable solutions through decentralized, self-learning systems that mimic human cognitive efficiency.

What Brain-Based AI Solves

Brain-inspired models aim to address fundamental challenges with current AI systems. The human brain's ability to build cohesive perceptions from fragmented inputs, like stitching together a clear visual image from saccades and peripheral signals, serves as a blueprint for these models, demonstrating how advanced functionality can emerge from minimal energy expenditure.

The Secret to Brain Efficiency: A Thousand Brains

Jeff Hawkins, the creator of the Palm Pilot, has dedicated decades to understanding the brain's neocortex and its potential for AI design. His Thousand Brains Theory of Intelligence posits that the neocortex operates through a universal algorithm, with approximately 150,000 cortical columns functioning as independent processors. These columns identify patterns, sequences, and spatial representations, collaborating to form a cohesive perception of the world. Hawkins' brain-inspired approach challenges traditional AI paradigms by emphasizing predictive coding and distributed processing, reducing energy demands while enabling real-time learning. Unlike Transformers, which centralize control, brain-based AI uses localized decision-making, creating a more scalable and adaptive system.

Is AI in a Bubble?

Despite immense investment in AI, the market's focus remains heavily skewed toward infrastructure rather than applications. NVIDIA's data center business alone generates tens of billions of dollars in annualized revenue, while major AI applications collectively bring in only a small fraction of that.
This imbalance has led to concerns about an AI bubble, reminiscent of the early 2000s dot-com and telecom busts, when overinvestment in infrastructure outpaced actual demand. The sustainability of current AI investments hinges on the viability of new models like brain-based AI. If these systems gain widespread adoption within the next decade, today's energy-intensive Transformer models may become obsolete, signaling a profound market correction.

Controlling Brain-Based AI: A Philosophical Divide

The rise of brain-based AI introduces not only technical challenges but also philosophical ones. Scholars like Joscha Bach argue for a reductionist approach, constructing intelligence through mathematical models that approximate complex phenomena. Others advocate for holistic designs, warning that purely rational systems may lack the broader perspective needed to navigate ethical and unpredictable scenarios.

This philosophical debate mirrors the physical divide in the human brain: one hemisphere excels in reductionist analysis, while the other integrates holistic perspectives. As AI systems grow increasingly complex, the philosophical framework guiding their development will profoundly shape their behavior, and their impact on society.

The future of AI lies in balancing efficiency, adaptability, and ethical design. Whether brain-based models succeed in replacing Transformers will depend not only on their technical advantages but also on our ability to guide their evolution responsibly. As AI inches closer to mimicking human intelligence, the stakes have never been higher.


DHS Introduces AI Framework to Protect Critical Infrastructure

The Department of Homeland Security (DHS) has unveiled the Roles and Responsibilities Framework for Artificial Intelligence in Critical Infrastructure, a voluntary set of guidelines designed to ensure the safe and secure deployment of AI across the systems that power daily life. From energy grids to water systems, transportation, and communications, critical infrastructure increasingly relies on AI for enhanced efficiency and resilience. While AI offers transformative potential, such as detecting earthquakes, optimizing energy usage, and streamlining logistics, it also introduces new vulnerabilities.

Framework Overview

The framework, developed with input from cloud providers, AI developers, critical infrastructure operators, civil society, and public sector organizations, builds on DHS's broader policies from 2023, which align with White House directives. It aims to provide a shared roadmap for balancing AI's benefits with its risks.

AI Vulnerabilities in Critical Infrastructure

The DHS framework categorizes vulnerabilities into three key areas. The guidelines also address sector-specific vulnerabilities and offer strategies to ensure AI strengthens resilience while minimizing misuse risks.

Industry and Government Support

Arvind Krishna, Chairman and CEO of IBM, lauded the framework as a "powerful tool" for fostering responsible AI development. "We look forward to working with DHS to promote shared and individual responsibilities in advancing trusted AI systems."

Marc Benioff, CEO of Salesforce, emphasized the framework's role in fostering collaboration among stakeholders while prioritizing trust and accountability. "Salesforce is committed to humans and AI working together to advance critical infrastructure industries in the U.S. We support this framework as a vital step toward shaping the future of AI in a safe and sustainable manner."

DHS Secretary Alejandro N. Mayorkas highlighted the urgency of proactive action.
"AI offers a once-in-a-generation opportunity to improve the strength and resilience of U.S. critical infrastructure, and we must seize it while minimizing its potential harms. The framework, if widely adopted, will help ensure the safety and security of critical services."

DHS Recommendations for Stakeholders

A Call to Action

DHS encourages widespread adoption of the framework to build safer, more resilient critical infrastructure. By prioritizing trust, transparency, and collaboration, this initiative aims to guide the responsible integration of AI into essential systems, ensuring they remain secure and effective as technology continues to evolve.


Growing Energy Consumption in Generative AI

Growing Energy Consumption in Generative AI, but ROI Impact Remains Unclear

The rising energy costs associated with generative AI aren't always central in enterprise financial considerations, yet experts suggest IT leaders should take note. Building a business case for generative AI involves both obvious and hidden expenses. Licensing fees for large language models (LLMs) and SaaS subscriptions are visible expenses, but less apparent costs include data preparation, cloud infrastructure upgrades, and managing organizational change.

One under-the-radar cost is the energy required by generative AI. Training LLMs demands vast computing power, and even routine AI tasks like answering user queries or generating images consume energy. These intensive processes require robust cooling systems in data centers, adding to energy use.

While energy costs haven't been a focus for GenAI adopters, growing awareness has prompted the International Energy Agency (IEA) to predict a doubling of data center electricity consumption by 2026, attributing much of the increase to AI. Goldman Sachs echoed these concerns, projecting data center power consumption to more than double by 2030.

For now, generative AI's anticipated benefits outweigh energy cost concerns for most enterprises, with hyperscalers like Google bearing the brunt of these costs. Google recently reported a 13% increase in greenhouse gas emissions, citing AI as a major contributor and suggesting that reducing emissions might become more challenging with AI's continued growth.

While not a barrier to adoption, energy costs play into generative AI's long-term viability, noted Scott Likens, global AI engineering leader at PwC, emphasizing that "there's energy being used — you don't take it for granted."

Energy Costs and Enterprise Adoption

Generative AI users might not see a line item for energy costs, yet these are embedded in fees.
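The "embedded" energy cost described above is easy to approximate with back-of-envelope arithmetic. Every figure in this sketch is an assumed placeholder (per-query energy varies widely by model and hardware); the point is how quickly small per-query numbers compound at scale.

```python
# Back-of-envelope estimate of the electricity behind an LLM service.
# All constants are assumptions for illustration, not measured values.

WH_PER_QUERY = 3.0           # assumed energy per LLM query, watt-hours
QUERIES_PER_DAY = 1_000_000  # assumed traffic for one service
PUE = 1.4                    # data center overhead (cooling, power delivery)
USD_PER_KWH = 0.10           # assumed industrial electricity price

kwh_per_year = WH_PER_QUERY * QUERIES_PER_DAY * 365 * PUE / 1000
cost_per_year = kwh_per_year * USD_PER_KWH
print(f"{kwh_per_year:,.0f} kWh/year -> ${cost_per_year:,.0f}/year")
```

At these assumptions a single million-query-per-day service consumes on the order of 1.5 GWh per year, which is the kind of figure that ends up amortized across licensees rather than itemized on their bills.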
Ryan Gross of Caylent points out that the costs are mainly tied to model training and inferencing, with each model query, though individually minor, adding up over time. These expenses are often spread across the customer base, as companies pay for generative AI access through a licensing model. A PwC sustainability study showed that GenAI power costs, particularly from model training, are distributed among licensees. Token-based pricing for LLM usage also reflects inferencing costs, though these charges have decreased. Likens noted that the largest expenses still come from infrastructure and data management rather than energy.

Potential Efficiency Gains

Though energy isn't a primary consideration, enterprises could reduce consumption indirectly through technological advancements. Newer, more cost-efficient models like OpenAI's GPT-4o mini are 60% less expensive per token than prior versions, enabling organizations to deploy GenAI on a larger scale while keeping costs lower. Small, fine-tuned models can be used to address latency and lower energy consumption, part of a "multimodel" approach that can provide different accuracy and latency levels with varying energy demands.

Agentic AI also offers opportunities for cost and energy savings. By breaking down tasks and routing them through specialized models, companies can minimize latency and reduce power usage. According to Likens, using agentic architecture could cut costs and consumption, particularly when tasks are routed to more efficient models.

Rising Data Center Energy Needs

While enterprises may feel shielded from direct energy costs, data centers bear the growing power demand. Cooling solutions are evolving, with liquid cooling systems becoming more prevalent for AI workloads. As data centers face the "AI growth cycle," the demand for energy-efficient cooling solutions has fueled a resurgence in thermal management investment.
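The "multimodel" routing idea described above can be sketched as a simple dispatcher: send easy requests to a small, cheap model and escalate hard ones. The cost and energy figures per model, and the routing heuristic itself, are invented for illustration; production routers typically use a classifier rather than keyword rules.

```python
# Toy router for a multimodel deployment: cheap model by default,
# expensive model only for long or explicitly complex tasks.
# Per-model cost/energy numbers are illustrative placeholders.

MODELS = {
    "small": {"cost_per_1k_tokens": 0.00015, "relative_energy": 1},
    "large": {"cost_per_1k_tokens": 0.00500, "relative_energy": 20},
}

def route(task: str) -> str:
    """Naive router: escalate long or explicitly analytical tasks."""
    if len(task.split()) > 50 or "analyze" in task.lower():
        return "large"
    return "small"

tasks = ["Summarize this memo", "Analyze quarterly anomalies across regions"]
choices = [route(t) for t in tasks]
print(choices)  # ['small', 'large']
```

Even this crude rule captures the economics: every request kept on the small model costs a fraction of the tokens and energy of the large one, which is the savings agentic routing aims to systematize.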
Liquid cooling, being more efficient than air cooling, is gaining traction due to the power demands of AI and high-performance computing. IDTechEx projects that data center liquid cooling revenue could exceed $50 billion by 2035. Meanwhile, data centers are exploring nuclear power, with AWS, Google, and Microsoft among those considering nuclear energy as a sustainable solution to meet AI's power demands.

Future ROI Considerations

While enterprises remain shielded from the full energy costs of generative AI, careful model selection and architectural choices could help curb consumption. PwC, for instance, factors in the "carbon impact" as part of its GenAI deployment strategy, recognizing that energy considerations are now part of the generative AI value proposition. As organizations increasingly factor sustainability into their tech decisions, energy efficiency might soon play a larger role in generative AI ROI calculations.


The Growing Role of AI in Cloud Management

AI technologies are redefining cloud management by automating IT systems, improving security, optimizing cloud costs, enhancing data management, and streamlining the provisioning of AI services across complex cloud ecosystems. With the surging demand for AI, its ability to address technological complexity makes a unified cloud management strategy indispensable for IT teams.

Cloud and security platforms have steadily integrated AI and machine learning to support increasingly autonomous IT operations. The rapid rise of generative AI (GenAI) has further spotlighted these capabilities, prompting vendors to prioritize their development and implementation. Adnan Masood, Chief AI Architect at UST, highlights the transformative potential of AI-driven cloud management, emphasizing its ability to oversee vast data centers hosting millions of applications and services with minimal human input. "AI automates tasks such as provisioning, scaling, cost management, monitoring, and data migration," Masood explains.

From Reactive to Proactive Cloud Management

Traditionally, CloudOps relied heavily on manual intervention and expertise. AI has shifted this paradigm, introducing automation, predictive analytics, and intelligent decision-making. This evolution enables enterprises to transition from reactive, manual management to proactive, self-optimizing cloud environments. Masood underscores that this shift allows cloud systems to self-manage and optimize with minimal human oversight.

However, organizations must navigate challenges, including complex data integration, real-time processing limitations, and model accuracy concerns. Business hurdles like implementation costs, uncertain ROI, and maintaining the right balance between AI automation and human oversight also require careful evaluation.

AI's Transformation of Cloud Computing

AI has reshaped cloud management into a more proactive and efficient process.
"AI enhances efficiency, scalability, and flexibility for IT teams," says Agustín Huerta, SVP of Digital Innovation at Globant. He views AI as a pivotal enabler of automation and optimization, helping businesses adapt to rapidly changing environments. AI also automates repetitive tasks such as provisioning, performance monitoring, and cost management. More importantly, it strengthens security across cloud infrastructure by detecting misconfigurations, vulnerabilities, and malicious activities.

Nick Kramer of SSA & Company highlights how AI-powered natural language interfaces simplify cloud management, transforming it from a technical challenge into a logical one. With conversational AI, business users can manage cloud operations more efficiently, accelerating problem resolution.

AI-Enabled Cloud Management Tools

Ryan Mallory, COO at Flexential, groups AI-powered cloud tools into several categories.

The Rise of Self-Healing Cloud Systems

AI enables cloud systems to detect, resolve, and optimize issues with minimal human intervention. For instance, AI can identify system failures and trigger automatic remediation, such as restarting services or reallocating resources. Over time, machine learning enhances these systems' accuracy and reliability.

Key Applications of AI in Cloud Management

AI's applications in cloud computing are wide-ranging.

Benefits of AI in Cloud Management

AI transforms cloud management by enabling autonomous systems capable of 24/7 monitoring, self-healing, and optimization. This boosts system reliability, reduces downtime, and provides businesses with deeper analytical insights. Chris Vogel from S-RM emphasizes that AI's analytical capabilities go beyond automation, driving strategic business decisions and delivering measurable value.
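The self-healing loop described above reduces to a detect-and-remediate cycle. This is a toy version: the threshold, service names, and metrics are invented, and the restart function is a stand-in for a real orchestrator call (e.g., a Kubernetes or cloud API), which is why it is injected as a parameter.

```python
# Toy self-healing loop: check a health signal and remediate any
# service that crosses a threshold, e.g. by restarting it.
# Threshold, metrics, and service names are illustrative.

def check_and_remediate(metrics: dict, restart_fn) -> list[str]:
    """Restart any service whose error rate exceeds the threshold."""
    THRESHOLD = 0.05  # 5% error rate, an assumed SLO
    remediated = []
    for service, error_rate in metrics.items():
        if error_rate > THRESHOLD:
            restart_fn(service)       # stand-in for an orchestrator call
            remediated.append(service)
    return remediated

restarted = []
actions = check_and_remediate(
    {"auth": 0.01, "billing": 0.12, "search": 0.03},
    restart_fn=restarted.append,
)
print(actions)  # ['billing']
```

In a real system the threshold would come from learned baselines rather than a constant, and remediation would be gated by safeguards, but the control loop has this shape.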
Challenges of AI in Cloud Management

Despite its advantages, AI adoption in cloud management presents its own challenges.

AI's Impact on IT Departments

AI's growing influence on cloud management introduces new responsibilities for IT teams, including managing unauthorized AI systems, ensuring data security, and maintaining high-quality data for AI applications. IT departments must provide enterprise-grade AI solutions that are private, governed, and efficient while balancing the costs and benefits of AI integration.

Future Trends in AI-Driven Cloud Management

Experts anticipate that AI will revolutionize cloud management, much like cloud computing reshaped IT a decade ago. Prasad Sankaran from Cognizant predicts that organizations investing in AI for cloud management will unlock opportunities for faster innovation, streamlined operations, and reduced technical debt. As AI continues to evolve, cloud environments will become increasingly autonomous, driving efficiency, scalability, and innovation across industries. Businesses embracing AI-driven cloud management will be well positioned to adapt to the complexities of tomorrow's IT landscape.


Where Will AI Take Us?

Author Jeremy Wagstaff wrote a thought-provoking article on the future of AI and how much of it we can predict from the past. This insight expands on that article.

Artificial intelligence (AI) refers to the simulation of human intelligence in machines that are programmed to think and learn. These machines can perform tasks that typically require human intelligence, such as visual perception, speech recognition, decision-making, and language translation. Many people think of artificial intelligence in terms of how they personally use it; some don't even realize when they are using it.

Artificial intelligence has long been a concept in human mythology and literature. Our imaginations have been grabbed by the thought of sentient machines constructed by humans, from Talos, the enormous bronze automaton (self-operating machine) that safeguarded the island of Crete in Greek mythology, to the spacecraft-controlling HAL in 2001: A Space Odyssey.

Artificial intelligence comes in a variety of flavors and can be categorized in several ways, including by capability and functionality. Most of us would still subdivide it into generative AI (itself a subset of narrow AI), predictive AI, and reactive AI.

Reflect on the AI journey through the Three C's (Computation, Cognition, and Communication) as the guiding pillars for understanding the transformative potential of AI. These concepts converge to shape the future of technology. Beyond a definition, what really is artificial intelligence: who makes it, who uses it, what does it do, and how?
Artificial Intelligence Companies: A Sampling

AI and Its Challenges

Artificial intelligence (AI) presents a novel and significant challenge to the fundamental ideas underpinning the modern state, affecting governance, social and mental health, the balance between capitalism and individual protection, and international cooperation and commerce. Addressing this amorphous technology, which lacks a clear definition yet pervades ever more facets of life, is complex and daunting. It is essential to recognize what should not be done, drawing lessons from past mistakes that may not be reversible this time.

In the 1920s, the concept of a street was fluid. People viewed city streets as public spaces open to anyone not endangering or obstructing others. However, conflicts between "joy riders" and "jay walkers" began to emerge, with judges often siding with pedestrians in lawsuits. Motorist associations and the car industry lobbied to prioritize vehicles, leading to the construction of vehicle-only thoroughfares. The dominance of cars prevailed for a century, but recent efforts have sought to reverse this trend with "complete streets," bicycle and pedestrian infrastructure, and traffic calming measures. Technology, such as electric micro-mobility and improved VR/AR for street design, plays a role in this transformation. The guy digging out a road bed for chariots and Roman armies likely considered none of this.

Addressing new technology is not easy: it has taken changes to our planet's climate, a pandemic, and the deaths of tens of millions of people in traffic accidents (3.6 million in the U.S. since 1899) to force a rethink. If we had better understood the implications of the first automobile technology, perhaps we could have made better decisions. Similarly, society should avoid repeating past mistakes with AI. The market has driven AI's development, often prioritizing those who stand to profit over consumers. You know, capitalism.
The rapid adoption and expansion of AI, driven by commercial and nationalist competition, have created significant distortions. Companies like Nvidia have soared in value due to AI chip sales, and governments are heavily investing in AI technology to gain competitive advantages. Listening to AI experts highlights the enormity of the commitment being made and reveals that these experts, despite their knowledge, may not be the best sources for AI guidance.

The size and impact of AI are already redirecting massive resources and creating new challenges. AI's demand for energy, chips, memory, and talent is immense, and the future of AI-driven applications depends on the availability of computing resources. The rise in demand for AI has already led to significant industry changes. Data centers are transforming into "AI data centers," and the demand for specialized AI chips and memory is skyrocketing. The U.S. government is investing billions to boost its position in AI, and countries like China are rapidly advancing in AI expertise. China may be behind in physical assets, but it is moving fast on expertise, generating almost half of the world's top AI researchers (Source: New York Times).

The U.S. has just announced it will provide chip maker Intel with $20 billion in grants and loans to boost the country's position in AI. Nvidia is now the third-largest company in the world, entirely because its specialized chips account for more than 70 percent of AI chip sales. Memory-maker Micron has mostly run out of high-bandwidth memory (HBM) stocks because of the chips' usage in AI; one customer paid $600 million up front to lock in supply, according to a story by The Stack. Back in January, the International Energy Agency forecast that data centers may more than double their electrical consumption by 2026 (Source: Sandra MacGregor, Data Center Knowledge).
AI is sucking up all the payroll: tech workers who don't have AI skills are finding fewer roles and lower salaries, or their jobs disappearing entirely to automation and AI (Source: Belle Lin at WSJ). Sam Altman of OpenAI sees a future where demand for AI-driven apps is limited only by the amount of computing available at a price the consumer is willing to pay. "Compute is going to be the currency of the future. I think it will be maybe the most precious commodity in the world, and I think we should be investing heavily to make a lot more compute," Altman said.

This AI buildup is reminiscent of past technological transformations, where powerful interests shaped outcomes, often at the expense of broader societal considerations. Consider early car manufacturers: they focused on the need for factories, components, and roads.

Read More
Better Sales and Services with Salesforce Unlimited Edition

Granular Data Center Overview

Granular Data Center Overview in Marketing Cloud Intelligence

The Granular Data Center is an advanced feature tailored for ingesting detailed, raw data into the system. Because of its granularity, this data can reach a scale of hundreds of millions or even billions of rows. Unlike other data stream types, usage and pricing are based on terabytes of storage rather than row count. Ideal data types for Granular Data Center streams include keyword-level data, event-level data, logs, and precise geodata. Granular Data Center streams generate corresponding tables of data specific to a workspace. All data stored in the Granular Data Center fully complies with GDPR regulations and requirements.

The Granular Data Center is a premium feature. For inquiries about purchasing, contact a Marketing Cloud Intelligence representative at Salesforce.

Deprovisioning the Granular Data Center add-on from an account triggers the following actions: Note: System admins and higher can still access the Granular Data Center for 90 days after unchecking the checkbox. Access will be unavailable after this period. Note: System admins and higher can continue running SQL queries and exports for 90 days. After this period, all Granular Data Center data streams are automatically deleted, along with their data.

When retrieving data from the Granular Data Center, be mindful of the system's timeout limits.

Enabling the Granular Data Center in a Workspace: Purchasing the Granular Data Center automatically activates it in the account, but an admin must enable it in the workspace to make the Granular Data Center tab visible.

Viewing Granular Data Center Data: The Granular Data Center landing page provides an overview of all data streams created in that workspace. From this centralized location, users can manage ingested data, aggregations, and extracted data, share data streams, create queries, and more.
Creating Granular Data Center Data Streams: Generate a Granular Data Center data stream to ingest detailed data, such as event-level or keyword-level data.

Mapping Granular Data Center Data: Upon file upload, or when using a technical vendor, users are directed to a mapping preview screen where they can verify data identification, modify mapping, add mapping formulas, and more. Each uploaded dataset creates a dynamic table tailored to the loaded data type, which affects data load options and behavior.

Querying Granular Data Centers: Access and extract data from Granular Data Centers within your workspace. Users can also query Granular Data Centers in other workspaces via data sharing. Queries can be crafted manually in an SQL editor or built with the Query Builder.

Visualizing Granular Data Center Data: The Entity-Relationship Diagram (ERD) visually represents tables and the connections between them. Each block symbolizes a table containing available fields, with lines denoting connections between tables based on specific dimensions.

Sharing Granular Data Centers: Relevant Granular Data Centers can be shared across workspaces within the same account.

Deleting Data from a Granular Data Center: To comply with data protection regulations, users can delete data from a Granular Data Center.

Related Posts Salesforce OEM AppExchange Expanding its reach beyond CRM, Salesforce.com has launched a new service called AppExchange OEM Edition, aimed at non-CRM service providers.
Read more The Salesforce Story In Marc Benioff’s own words How did salesforce.com grow from a start up in a rented apartment into the world’s Read more Salesforce Jigsaw Salesforce.com, a prominent figure in cloud computing, has finalized a deal to acquire Jigsaw, a wiki-style business contact database, for Read more Health Cloud Brings Healthcare Transformation Following swiftly after last week’s successful launch of Financial Services Cloud, Salesforce has announced the second installment in its series Read more

Read More
Salesforce Shield Data Monitoring and Encryption

Salesforce Shield Encryption

Salesforce uses a symmetric encryption key to encrypt the customer data that it stores. (The symmetric encryption used is AES with 256-bit keys in CBC mode, with PKCS5 padding and a random initialization vector (IV).) Salesforce Shield Encryption works as follows:

1) There are three channels for entering data into Salesforce: a user on a desktop browser, a user on a mobile device, or a system making an API call directly into Salesforce.

2) The application servers in the Salesforce data centers serve as a gateway, intercepting incoming requests, determining which data elements should be encrypted or decrypted, and applying the appropriate encryption credentials. The data encryption key (which is also the decryption key) is never transmitted or even written to disk (persisted).

3) The key is created/derived in the Salesforce Platform and never leaves it. It is created in a component of the platform called the Key Derivation Server. The encryption key is derived from a combination of a Salesforce-held component and a customer/tenant-specific component. These are called secrets; they are sometimes also referred to as key fragments.

4) The encryption key in Salesforce Shield Encryption is generated from the master secret (the Salesforce component) and the tenant secret (the customer component) using PBKDF2 (Password-Based Key Derivation Function 2).

5) The derived data encryption key is then securely passed to the encryption service and held in the cache of an application server. To encrypt, Salesforce retrieves the data encryption key from the cache and performs the encryption. To decrypt, Salesforce reads the encrypted data from the database; if the encryption (decryption) key is not in the cache, it is derived again using the associated tenant secret, and the data is then decrypted using the key and the associated IV.
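The key-derivation step (4) can be sketched in Python using the standard library's PBKDF2 implementation. This is an illustrative sketch only: the secret sizes, the iteration count, and the exact way the master and tenant secrets are combined (here, tenant secret as password and master secret as salt) are assumptions for demonstration, not Salesforce's published parameters.

```python
import hashlib
import os

def derive_data_encryption_key(master_secret: bytes, tenant_secret: bytes,
                               iterations: int = 15000) -> bytes:
    """Derive a 256-bit data encryption key from two key fragments via
    PBKDF2-HMAC-SHA256. Treating the tenant secret as the password and
    the master secret as the salt is an assumption for illustration."""
    return hashlib.pbkdf2_hmac("sha256", tenant_secret, master_secret,
                               iterations, dklen=32)

# Simulated key fragments (in Shield, these never leave the platform)
master_secret = os.urandom(32)
tenant_secret = os.urandom(32)

key = derive_data_encryption_key(master_secret, tenant_secret)
rederived = derive_data_encryption_key(master_secret, tenant_secret)

assert key == rederived   # deterministic: the key can be re-derived on demand
assert len(key) == 32     # 256 bits, suitable for AES-256
```

Because the derivation is deterministic, the key can be dropped from the application-server cache and re-derived later from the same two secrets, which is what step 5 describes happening on a cache miss.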

Read More
Cloud Based Managed Services

Cloud Based Managed Services

At Tectonic, our aim is to let you concentrate on your business operations while we handle the intricacies of technology. Our team oversees all technology assets and evaluates your IT systems to ensure your employees have the equipment they need to support their work. We work diligently in the background to keep your business running seamlessly, handling tasks such as consistent backups, 24/7 server monitoring, event logging, and more.

Managed Cloud: Cloud managed services, or managed cloud, refer to applications, services, or ecosystems in the cloud that are managed by a third-party organization. These services include IT tasks such as engineering on demand, operations management, 24/7 help desk support, hosting, and implementation. Cloud management is the organized administration of cloud computing products and services, encompassing the processes, strategies, policies, and technology used to control and maintain public, private, hybrid, or multicloud environments. Cloud infrastructure managed services involve providers such as Microsoft, AWS, HubSpot, IBM, and Google Cloud, allowing businesses to receive personalized tech support from specialists in a specific provider's software. Examples of cloud-based services include file storage and backup, web-based email, and project management tools; SaaS providers include Dropbox, G Suite, Microsoft Office 365, and Slack. While cloud service providers are cost-effective, they lack the robustness of fully managed IT services. With Tectonic, managed cloud services provide scalable solutions, offering benefits like consistent backups, disaster recovery, and the flexibility to adapt to changing business requirements without downtime. Managed services differ from SaaS in that they go beyond software provision, often handling networking and hardware requirements as well.
Managed cloud services offer hybrid IT and cloud administration, with providers delivering distinct value in areas like migration, optimization, security, and configuration. Advantages of managed cloud service providers (MCSPs) include resource optimization, cloud service integration, and predictable spending. Potential drawbacks include high costs, optimizations that can affect performance, and security risks due to multi-tenancy. Amid the global COVID-19 pandemic, cloud adoption surged as companies embraced remote work models. Businesses increasingly partner with cloud managed service providers to optimize IT infrastructure, address regulatory compliance, and ensure security in the cloud environment. Local servers, physically located on premises or in private data centers, are not considered cloud services; they host applications, websites, or services within a specific organization's infrastructure.

Read More
Salesforce Cloud Hosted CRM

Is Salesforce Hosted in the Cloud?

Is Salesforce cloud hosted? What about Salesforce data? Salesforce Cloud is a suite of cloud-based customer relationship management software solutions that help businesses connect with customers, close deals, and deliver service.

Was Salesforce always cloud based? Salesforce was founded in 1999 by Marc Benioff, then a sales executive at Oracle, one of the largest software companies in the world. The company was founded on a single, bold premise: that software should be made available to the masses, on a 24/7 basis, over a global cloud computing infrastructure. Marc Russell Benioff is an American internet entrepreneur and philanthropist, best known as the co-founder, chairman, and CEO of Salesforce, and as the owner of Time magazine since 2018. Salesforce, a leading customer relationship management (CRM) company, chose Amazon Web Services (AWS) as its primary cloud provider in 2016. Today, Salesforce and AWS have a global strategic relationship focused on technical alignment and joint development. Salesforce remains a cloud hosted solution.

Where is Salesforce data hosted? It depends on whether your org is on Hyperforce. If you are on Hyperforce, your data is stored on AWS. If not, it is stored in an Oracle Database within Salesforce data centers. In both cases, Salesforce is cloud hosted.

Read More
Cloud Managed Services

Advantages of a Cloud Managed Service Provider

Considering outsourcing your IT management to a cloud managed service provider? Here are several benefits of opting for a cloud expert like Tectonic:

- Cost savings
- Predictable, recurring monthly costs
- Future-proof technology
- Custom and integrated service
- Robust infrastructure
- Centralized network services and applications
- Coverage at all service levels
- Disaster recovery
- Fast response times
- Vendor interfacing

Tectonic offers Managed Services for all your Salesforce platform IT needs. Contact us today to get started.

Read More
cloud computing

Top Ten Reasons Why Tectonic Loves the Cloud

The Cloud is Good for Everyone – Why Tectonic Loves the Cloud

Read More
gettectonic.com