Google Archives - gettectonic.com - Page 18

Web Pages That Helped With My Google Data Engineer Exam

It seems like every day more resources appear to help you study for the Google Data Engineer certification, so in the interest of helping aspiring data engineers, here are some URLs I found helpful:

- Life of a BigQuery streaming insert
- When to use different types of storage in Google Cloud
- The Google study guide for the Data Engineer exam
- A quick guide to Apache Hadoop (several questions on this)
- The mobile gaming scenario (this content was covered extensively)
- The Spotify architecture (helpful because it is similar to the scenarios you will deal with on the test)
- An example of when to use Bigtable versus BigQuery
- Data transfer options for Google Cloud
- Lots and lots of case studies

Look for more helpful posts on studying for your Data Engineer Exam!

A Professional Data Engineer makes data usable and valuable for others by collecting, transforming, and publishing data. This individual evaluates and selects products and services to meet business and regulatory requirements. A Professional Data Engineer creates and manages robust data processing systems, including the ability to design, build, deploy, monitor, maintain, and secure data processing workloads. Recommended experience: 3+ years of industry experience, including 1+ years designing and managing solutions using Google Cloud.


Salesforce’s Quest for AI for the Masses

The software engine Optimus Prime (not to be confused with the Autobot leader) originated in a basement beneath a West Elm furniture store on University Avenue in Palo Alto. A group of artificial intelligence enthusiasts within Salesforce, seeking to enhance the impact of machine learning models, embarked on this mission two years ago. While shoppers checked out furniture above, they developed a system to automate the creation of machine learning models. Thus began Salesforce's quest for AI for the masses.

Although the system was initially named after the Transformers leader, the tie-in was abandoned, and Salesforce named its AI program Einstein. This move reflects the ambitious yet practical approach Salesforce takes in the AI domain. In March, a significant portion of Einstein became available to all Salesforce users, aligning with the company's tradition of making advanced software accessible via the cloud.

Salesforce, although now an industry giant, retains its scrappy upstart identity. When the AI trend gained momentum, the company aimed to create "AI for everyone," focusing on making machine learning affordable and accessible to businesses. This populist mission emphasizes practical applications over revolutionary or apocalyptic visions.

Einstein's first widely available tool is the Einstein Intelligence module, designed to assist salespeople in managing leads effectively. It ranks opportunities based on factors like the likelihood to close, offering a practical application of artificial intelligence. While other tech giants boast significant research muscle, Salesforce focuses on providing immediate market advantages to its customers.

Einstein Intelligence

The Einstein Intelligence module employs machine learning to study historical data, identifying factors that predict future outcomes and adjusting its model over time. This dynamic approach allows for subtler and more powerful answers, making use of various data sources beyond basic Salesforce columns.
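The opportunity-ranking idea described above can be illustrated with a toy sketch. This is not Salesforce's actual implementation; the feature names, weights, and logistic scoring below are purely hypothetical, but they show the general shape of ranking leads by likelihood to close:

```python
import math

# Hypothetical feature weights a model might learn from historical
# won/lost opportunities (illustrative values, not Einstein's).
WEIGHTS = {"emails_opened": 0.8, "meetings_held": 1.2, "days_stale": -0.5}
BIAS = -1.0

def close_probability(lead):
    """Logistic score: a probability-like estimate that a lead closes."""
    z = BIAS + sum(WEIGHTS[f] * lead.get(f, 0.0) for f in WEIGHTS)
    return 1.0 / (1.0 + math.exp(-z))

def rank_leads(leads):
    """Rank opportunities by likelihood to close, best first."""
    return sorted(leads, key=close_probability, reverse=True)

leads = [
    {"name": "Acme",   "emails_opened": 3, "meetings_held": 2, "days_stale": 1},
    {"name": "Globex", "emails_opened": 0, "meetings_held": 0, "days_stale": 4},
]
for lead in rank_leads(leads):
    print(lead["name"], round(close_probability(lead), 3))
```

A real system would learn the weights from historical records and refresh them over time, which is the "adjusting its model" behavior the article describes.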
Salesforce's AI team strives to democratize AI by offering ready-made tools, ensuring businesses can benefit from machine learning without the need for extensive customization by data scientists. The company's multi-tenant approach, serving 150,000 customers, keeps each company's data separate and secure.

To scale AI implementation across its vast customer base, Salesforce developed Optimus Prime. This system automates the creation of machine learning models for each customer, eliminating the need for extensive manual involvement. Optimus Prime, the AI that builds AIs, streamlines the process and accelerates model creation from weeks to just a couple of hours.

Salesforce plans to expand Einstein's capabilities, allowing users to apply it to more customized data and enabling non-programmers to build custom apps. The company's long-term vision includes exposing more of its machine learning system to external developers, competing directly with AI heavyweights like Google and Microsoft in the business market.

Originally published in WIRED magazine on August 2, 2017, and rewritten for this insight.


Top Ten Reasons Why Tectonic Loves the Cloud

The Cloud is Good for Everyone – Why Tectonic Loves the Cloud


We Are All Cloud Users

My old company and several others are concerned about security, and feel more secure being able to walk down the hall and see the servers working away in that air-conditioned closet. But who do you trust more: your IT guy, who looks at your IT security every once in a while, or a provider like Google, which has a team of engineers constantly monitoring its systems for security issues? In fact, Google has some of the best security specialists in the world.


Alphabet Soup of Cloud Terminology

As with any technology, the cloud brings its own alphabet soup of terms. This insight will hopefully help you navigate your way through the terminology and give you the knowledge to make the decisions you need to make when considering a new cloud implementation. Here's the list of terms we will cover in this article: SaaS, PaaS, IaaS, and DaaS. Phew—that's a lot. Let's dig into the definitions and examples to help drive home the meanings of the terms above.

SaaS (Software as a Service)

This is probably the most common implementation of cloud services end users experience: software that users access through their web browser. Some software may be installed locally to help augment functionality or provide a richer user experience, but the software installed locally has minimal impact on the user's computer. Figure 1 provides a high-level overview of this concept.

Figure 1: High-level overview of Software as a Service

You are probably a user of Facebook, Google Docs, Office 365, Salesforce, or LinkedIn, either at home or at work, so you've experienced SaaS first hand, and probably for a long time. What SaaS tools are you using outside of those mentioned here? Reach out and let me know—I'm very curious.

PaaS (Platform as a Service)

PaaS allows a developer to deploy code to an environment that supports their software, but without full access to the operating system. In this case the developer has no server responsibility or server access. When I first started writing about cloud technology three years ago, this was a fairly primitive kind of service: the provider would just give you access to a folder somewhere on the server with a bit of documentation, and then you were on your own. Now there are tools, such as Cloud Foundry, that allow a developer to deploy right from their Integrated Development Environment (IDE) or from a command-line production release tool. Cloud Foundry can then take the transmitted release and install it correctly into the cloud environment.
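To make that modern PaaS workflow more concrete, here is a minimal sketch of a Cloud Foundry application manifest. The application name, memory size, and buildpack are hypothetical examples, not values from the original article:

```yaml
# manifest.yml - read by `cf push`; all values here are illustrative
applications:
- name: sample-web-app      # hypothetical application name
  memory: 256M              # per-instance memory limit
  instances: 2              # run two instances for availability
  buildpacks:
    - nodejs_buildpack      # assumes a Node.js application
```

With a file like this in the project root, running `cf push` uploads the code and lets the platform build, stage, and run it; the developer never logs in to the underlying server.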
With a little trial and error, anyone with a bit of technical skill can deploy to a tool like Cloud Foundry, where the older style of PaaS took a lot of skill and experience to deploy correctly.

IaaS (Infrastructure as a Service)

Originally, IaaS meant a provider giving a user access to a virtual machine located on a system in the provider's data center. A virtual machine is an operating system that resides in a piece of software on the host computer. VirtualBox, Parallels, and VMware are examples of software that provide virtualization of operating systems as virtual machines (VMs).

Virtualization of servers was all the rage for a while, but when you try to scale within the cloud with multiple virtual servers there are a lot of drawbacks. First, it's a lot of work to make VMs aware of each other, and they don't always share filesystems and resources easily. Plus, as your needs grow, VMs with a lot of memory and disk space are very expensive, and very often an application on a VM is only using a portion of the OS. For example, if you are deploying a tool that does data aggregation and runs as a service, you won't be taking advantage of the web server that might be running on the server too.

The issues mentioned in the previous paragraph are common headaches for those moving their on-premises implementations to the cloud, and those headaches gave rise to Docker. Docker is a lighter-weight form of virtualization that allows for easier sharing of files, versioning, and configuration. Servers that could only host a few VMs can host thousands of Docker containers, so providers get better bang for the buck on their server purchases. A full explanation of Docker is an article all by itself, but for now it's important to realize that Docker needs to be part of any discussion of moving your applications to the cloud.

DaaS (Desktop as a Service)

Desktop computers are expensive for large corporations to implement and maintain.
The cost of the OS, hardware, security software, productivity software, and more starts to add up to where it makes a major impact on any corporation's budget. Then, just as they finish deploying new systems to everyone in the company, it's time to start upgrading again because Microsoft just released a new OS. Another fact about most desktop computers is that they are heavily underutilized, and DaaS allows an IT department to dynamically allocate RAM and disk space based on user need. In addition, backups and restores are a breeze in this environment, and if you are using a third-party provider, all you need to do is make a phone call when a restore of a file or desktop is needed. Plus, upgrades to new operating systems are seamless because the DaaS provider takes care of them for you.

The main advantage I see with DaaS is security. On one project I was involved with, we restored the state of each desktop to a base configuration each night. While this did not affect user files, it did remove any malware that might have been accidentally installed by a user clicking on the wrong email. Documents from Microsoft Office or Adobe products were scanned with a separate antivirus program residing on the storage system they were a part of, and the network appliance that we used did not allow for the execution of software. That made it very secure for the client I was working with.

So what does a user have on their desktop? Luckily, in recent years there has been an explosion of low-cost computing devices, such as the Raspberry Pi, that support Remote Desktop Protocol (RDP), so your users could access a Windows desktop from the Linux-based Pi, which you can get for a measly sum. DaaS is awesome for your average information worker, but for a power user like a software developer this setup, in my experience, doesn't work well. Your average developer needs
