By Dean Furst
The new millennium is barely a teenager, but it has already developed an identity of its own. Cisco estimates that “…by the end of this decade, the number of connected
devices will grow from 9 billion today to over 50 billion.”[1]
What’s even more staggering is the amount of data that will be transmitted,
shared, and downloaded between them. Online information will be collected
through social media sites, search engines, downloads, uploads, and real-time
security cameras in cars and buildings. There are algorithms that evaluate how drivers
react to incoming traffic, how consumers conduct themselves in department
stores, and even ones that predict a customer’s next online click before the previous one is made. All of these algorithms base their conclusions on actions
made by consumers.
The problem with ‘big data’ is that few corporations have the database management
systems to process the sheer size of the information. Usually only governmental
entities and organizations with advanced central computer processing power have
the ability to make sense of the data, identify trends, and draw conclusions from
it. A recent IBM study that surveyed 1,000 business
professionals found that “63% of respondents indicated that the use of
information (including big data) and analytics is creating a competitive
advantage for their organizations–a 70% increase in the past two years alone.”[2] In
the modern marketplace, these abilities are creating a growing
inequality (or advantage, however you wish to view it) between two sets of
organizations: those who have the processing power to understand ‘big data’,
and those who lack it.
Seeing as internet traffic will pass 100 billion terabytes in a few short years,
Silicon Valley startups are banding together under the same banner: providing a
cost-sensitive way to collect and interpret data. Cloudera, one such
company, sells a distribution of Hadoop, a software framework built to process
data at this scale. The price? $525,000,
and that doesn’t factor in the annual support costs or the installation
fee. Even so, Cloudera’s fees rank as some of the
most affordable around. Recall the earlier point about smaller companies
struggling to afford the systems necessary to make sense of the
data traffic. Smaller companies simply can’t stomach that kind of cost, or
don’t see it as a worthy investment in their current standing. The price alone
is enough to create an inequality in the market, giving wealthier and more
developed firms the ability to gain a competitive advantage by evaluating
the data faster and cheaper.
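For readers unfamiliar with what software like Hadoop actually does: its core programming model is MapReduce, which splits a job into a “map” phase that tags each piece of raw data, a “shuffle” phase that groups the tags, and a “reduce” phase that summarizes each group. The toy sketch below shows that flow in plain Python on a single machine; the sample log lines are invented for illustration, and real Hadoop distributes these same phases across a cluster.

```python
from collections import defaultdict

# Toy sketch of the MapReduce model that underpins Hadoop.
# Real Hadoop runs these phases in parallel across many machines;
# this single-machine version only illustrates the data flow.

def map_phase(records):
    # Map: emit a (key, value) pair for every word in every record.
    for record in records:
        for word in record.split():
            yield (word.lower(), 1)

def shuffle_phase(pairs):
    # Shuffle: group all emitted values by key.
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    # Reduce: collapse each key's values into one result (here, a sum).
    return {key: sum(values) for key, values in groups.items()}

# Hypothetical clickstream records standing in for 'big data'.
logs = ["user clicked ad", "user clicked product", "user bought product"]
counts = reduce_phase(shuffle_phase(map_phase(logs)))
print(counts["user"], counts["clicked"], counts["product"])
```

The point of the model is that each phase is trivially parallel, which is why throwing more machines at the problem scales so well, and why the expensive part is the cluster and the expertise, not the idea itself.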
But data alone means little in the competitive market if you can’t find a way to
offer its services to your customers. Most organizations that can draw
conclusions from the data are able to better
understand consumer trends in spending and demand. The companies that aren’t
able to number-crunch at the same magnitude often outsource their analysis
elsewhere. For example, McKinsey and Company employs thousands of algorithm
analysts and tech geniuses to evaluate and pick over terabytes of data, coming
up with meaningful conclusions for their clients. However, many organizations
lack a comparable supply of brain power, or rather, it would be too
costly to employ the necessary people. This leads us to an
interesting dynamic.
Nielsen, a global information company, put the dynamic well: “…smaller players have historically relied more heavily on something their larger counterparts could only dream of—a personal relationship with their customers. Today, however, big data isn’t just for the big guys.”[3]
Data is unique to a specific consumer type, and even to each consumer. Companies
that can individualize their findings, predict future patterns in spending and
investing, and offer strategic planning in real time will soon monopolize the
market for client services. Nielsen also found, in a recent poll of over
2,000 small business owners, that “even with new solutions tailored for small
business, a large portion of owners still lack the expertise and the time to
make good use of the information.”[4] This
explains why the businesses currently best adapted to analyzing big data
are usually government entities or specialty firms. But as always, companies
evolve and adapt, and now that the amount of data transferred between devices
exceeds the entire amount of data transferred during the WWII
years, this evolution is necessary to stay competitive in the marketplace. Exploiting
this advantage has become a pivotal point for many business owners and CEOs. In
a recent survey of professionals, the software company SAS found that “73 percent of survey respondents
say their collection of data has increased ‘somewhat’ or ‘significantly.’”[5]
The survey also found that “strategic data managers –
those with a well-defined data management strategy that focuses on collecting
and analyzing the most valuable data – tend to financially outperform their
competition more than others – 53
percent, compared with 36 percent.”[6]
Some companies answer the trend by hiring analysts and data managers who specialize in the collection
and interpretation of data; others purchase computing power to rival (or come
as close as legally possible to) the computers at the NSA or other governmental
entities. Business models are changing, evolving, and becoming more complex, and
data analytics is the base to build on. It’s more advantageous to
update your existing data metrics environment to accommodate the influx than to
try to run modern logistics on an outdated system. Bringing it full circle back
to Cloudera: Intel just invested over $700M into the company for further
research, but the problem remains that managers and employees often lack the
skills necessary to make sense of the data. Intel Vice-President Jason Waxman
stated in an article that the Hadoop system “isn’t a magic fix…companies are
going to need people who understand what they’re looking for.”[7]
Many small businesses feel threatened by expensive operating systems, terabytes of
information, and the unique language that data uses. But contrary to popular belief,
smaller companies that keep pace with the latest technological advances are
able to access the services offered by these larger corporations. The key
word is ‘services’: they aren’t spending three quarters of a million dollars on
software; they’re simply using pre-existing services to personalize their
client relationships. More offerings like Cloudera’s will be made available to
small businesses, because there is an
obvious market for the services. Smaller companies build their revenue on
interpersonal client relationships, and understanding data is an added benefit
that could maximize profits.
[1] "Big Data for the Real World." VentureBeat. VentureBeat, 6 Feb. 2014. Web. 04 June 2014.
[2] "New Study Details How Enterprises Use Big Data." The Big Data Hub. IBM Information Management, 17 Oct. 2012. Web. 04 June 2014.
[3] "Newswire." How Small Businesses Can Scale the Big Data Barrier. Nielsen, 18 Mar. 2014. Web. 05 June 2014.
[4] "Newswire." How Small Businesses Can Scale the Big Data Barrier. Nielsen, 18 Mar. 2014. Web. 05 June 2014.
[5] Troester, Mark. "Small Business, Big Data." Business Analytics and Business Intelligence Software. SAS, n.d. Web. 05 June 2014.
[6] Troester, Mark. "Small Business, Big Data." Business Analytics and Business Intelligence Software. SAS, n.d. Web. 05 June 2014.
[7] King, Ian. "Big Data Is Really About Small Things." Bloomberg Business Week. Bloomberg, 4 June 2014. Web. 05 June 2014.