1.1 Analytics: An overview and timeline
Before I proceed, let me define what analytics or analytical computing is all about. Analytical computing (as opposed to operational computing), in its simplest rendition, is about using large volumes of data (generally historical in nature, often a year or more old) to detect trends, patterns, or anomalies and make better decisions. Analytics is at the heart of what used to be called a decision support system (DSS), a term that is teetering on the brink of obsolescence today due to the ubiquity of analytics.
Figure 1.1: Analytics timeline (Source: https://www.fico.com/en/latest-thinking/infographic/the-analytics-big-bang)
1.1.1 The early years
The infographic shown in Figure 1.1 gives a good sense of how analytics has evolved over the past seven to eight decades. It also explodes the myth that analytics is a recent innovation. Probably the first documented foray into analytics came in 1944 with the Manhattan Project, whose team ran computer simulations to predict the behavior of nuclear reactions. Needless to say, it was rare in those days to use computers for analytics, primarily because computers and computing technology were in an embryonic state and still seemed to belong to the realm of science fiction. The fifties and sixties benefited tremendously from advances made in computer science (in both hardware and software), and the ENIAC (the most powerful computer of its day) was able to carry out simulations of weather models and make weather predictions. It was also in the sixties that sophisticated software algorithms could compute the shortest path between any two points, paving the way for optimization in logistics and air travel. However, the technology revolution had not yet taken off. Computers were still defined by their size (the bigger, the better), and due to their prohibitive cost, only the biggest companies and research organizations (those with generous funding from their primary benefactors, generally the government) seemed to have a stranglehold on these amazing machines.
1.1.2 The preteens
The computing revolution truly took off in the seventies as Moore’s Law manifested itself not only in the inverse relationship between processing speed and processor size, but also in the rapidly plummeting cost of computers. A slew of computing technologies went mainstream, including personal computing, which ensured that the digital divide between the haves and have-nots would rapidly shrink, if not vanish. It also drove a steady transition in big corporations from mainframes to personal computers, which rapidly expanded an organization’s overall computing capabilities. Mid-size and small companies that had been excluded from the computing race could now afford to buy computers. And three companies that did not even exist in the sixties came to establish themselves as forces to be reckoned with: Microsoft, SAP, and Oracle. However, none of these had much of a role to play in the analytics realm in their early days. Microsoft was trying to corner the market for operating systems and office computing, SAP had virtually created a new category (enterprise resource planning, or ERP), and Oracle had databases and database management in its sights.
Although advances in analytics were not very widespread during the seventies and eighties (barring some research-oriented, heavy-duty analytics), the value of a DSS and the role that analytical computing played in a DSS were established. The challenge was that the demand for analytics was not particularly strong, nor were companies willing to sacrifice their operational needs for the still somewhat vague notion of analytics.
1.1.3 The teens and early adulthood
As we entered the nineties, devices got smaller, faster, smarter, and cheaper. The rapid spread of the Internet during this decade virtually guaranteed that computing and connectivity became almost synonymous. Organizations and people had access to all kinds of data. That data was growing at a colossal rate, and all of it could be crunched or mined to make organizations and people smarter. Enter analytics. During the nineties, upstarts such as Amazon, eBay, and (to a limited extent) Google started using analytics to better understand people’s buying habits, to tailor their offerings accordingly, and to predict what consumers might be interested in buying in the future.
Meanwhile, SAP, Oracle, and Microsoft had become behemoths in their respective areas. While SAP bestrode the world of ERP like a colossus, Oracle became the undisputed master of relational databases, and Microsoft had captured not only the operating system market with Windows but also the business applications market with its Microsoft Office suite. IBM, still one of the original giants, was doing a slew of things in both hardware and software. It is important to note that none of these giants were taking a serious look at data warehousing, business intelligence, or analytics. At best, they were casually flirting with the idea of analytics. But some visionary companies that did not have to carry any of these commercial and legacy-imposed burdens, such as SAS, MicroStrategy, Tableau, and BusinessObjects, had started making strong forays into the world of analytics. Their quickly expanding customer bases established that the analytics market was on the cusp of a boom.
1.1.4 Coming of age
As we entered the new millennium, the entire IT industry was on a tear and things were changing at a dizzying speed. Consider all the disruptive innovations of the first few years of the new millennium:
- Connectivity to the Internet had become commonplace in the developed world, with developing nations taking big strides. The era of dial-up (and with it slow downloads and long waits) was practically over, and connection speeds were rising rapidly. Advances in networking technology, epitomized by broadband cable and Wi-Fi, made it easier for large volumes of data to be streamed in real time with ever fewer incidences of buffering.
- The PC revolution had yielded to the laptop era, which itself was in danger of losing its preeminent position to smart devices such as smartphones and tablets. The rapid advance in mobility has had tremendous implications: it has spawned the ‘app’ economy, geographical barriers have largely crumbled, and data about conceivably anything you care to know is a tap or finger-swipe away.
- The venerable Moore’s Law was still proving itself true, and computing and storage prices were dropping even faster. In the realm of storage, innovations such as in-memory computing and advanced compression algorithms combined to make it easier to store massive volumes of data.
- Grid computing (initially championed by IBM) and cloud computing (with Amazon taking the lead) were being widely embraced by a variety of enterprises. This trend ensured that on-demand access to world-class computing capabilities was just a credit card swipe away.
- Although the late nineties saw a spike in collaboration and social networking in the form of chat rooms, the lack of bandwidth and limited coverage of the Internet kept this industry shackled. With the rapid rise of connectivity in terms of both speed and coverage, collaboration and social networking started growing exponentially. The initial years of the new millennium saw the emergence of upstarts like MySpace (now defunct), LinkedIn, and some less heralded ones like Second Life and Del.icio.us. Then came Facebook and Twitter, names that are now practically synonymous with social media, followed by Spotify, Instagram, Snapchat, and WhatsApp. These are some of the better-known names; there are also many large social media apps popular largely outside the U.S., such as Orkut (now defunct), WeChat, and Sina Weibo. Suffice it to say that if you are connected to the Internet in some shape or form, you are a social networker.
- The global economy had become so integrated that technology had rendered physical boundaries meaningless, global supply chains had become the norm, and competition was so intense that you had to react almost instantaneously to any situation (planned or unplanned) to sustain your competitive edge. While transaction processing applications were still necessary, they were no longer sufficient for the kind of data analysis companies now needed to do.
What exactly are the key messages that emerge from the convergence of these trends?
1. We are inundated by data coming at us from all directions in all shapes and forms, and the flood is only likely to become more relentless. Today’s Big Data could seem puny compared to the volume, velocity, and variety we will see just a few months from now.
2. The expectation that we (whether as individuals or as enterprises) should be able to make sense of this data in real time has almost become a right. With ever-increasing access to cheap but extremely powerful computing resources, software applications need to incorporate analytics capabilities seamlessly into their offerings.
3. Although historical or descriptive analysis is important insofar as it helps explain what went right or wrong, it is equally (if not more) important to make reasonable projections (if not predictions) about the future (predictive analytics).
4. Data analytics cannot be the prerogative of data scientists alone. The fruits of data analytics need to be enjoyed by everyone in the enterprise, from entry level through executive level, regardless of technical expertise. Data analytics needs to be a commodity that is made available to its consumers on demand (perhaps with some effort from the data scientists).