Is there a way for IT leaders to be proactive about AI and machine learning without ruffling and rattling an organization of people who want the miracles of AI and ML delivered tomorrow morning? The answer is yes.

How should IT leaders and professionals go about selecting and delivering the technology required to realize the storied marvels of artificial intelligence and machine learning? Delivering on the promise of AI and ML means getting many moving parts into the right places, all moving in the right direction: ecosystems, data, platforms, and, last but not least, people.

The authors of a recent report from MIT Sloan Management Review and SAS advocate a relatively new methodology, called “ModelOps,” for successfully delivering AI and ML to enterprises. While there are a lot of “xOps” terms now entering our lexicon, such as MLOps or AIOps, ModelOps is “more mindset than a specific set of tools or processes, focusing on effective operationalization of all types of AI and decision models.”

That’s because in AI and ML, models are the heart of the matter: the mechanisms that dictate the assembly of the algorithms and assure continued business value. ModelOps, short for model operationalization, “focuses on model life cycle and governance; intended to expedite the journey from development to deployment — in this case, moving AI models from the data science lab to the IT organization as quickly and effectively as possible.”

In terms of operationalizing AI and ML, “a lot falls back on IT,” according to Iain Brown, head of data science for SAS, U.K. and Ireland, who is quoted in the report. “You have data scientists who are building great innovative things. But unless they can be deployed in the ecosystem or the infrastructure that exists — and typically that involves IT — there’s no point in doing it. The data science community and AI teams should be working very closely with IT and the business, being the conduit to join the two so there’s a clear idea and definition of the problem that’s being faced, a clear route to production. Without that, you’re going to have disjointed processes and issues with value generation.”

ModelOps is a way to help IT leaders bridge that gap between analytics and production teams, making the AI- and ML-driven life cycle “repeatable and sustainable,” the MIT-SAS report states. It’s a step above MLOps and AIOps, which “have a more narrow focus on machine learning and AI operationalization, respectively”; ModelOps focuses on the delivery and sustainability of predictive analytics models, which are the core of AI and ML’s value to the business. ModelOps can make a difference, the report’s authors continue, “because without it, your AI projects are much more likely to fail completely or take longer than you’d like to launch. Only about half of all models ever make it to production, and of those that do, about 90% take three months or longer to deploy.”
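
The report treats ModelOps as a mindset rather than a toolset, but one concrete step it describes, moving a model from the data science lab into a governed, versioned life cycle, can be sketched in code. Below is a minimal, hypothetical illustration using MLflow; the tool choice, model name and metric are our assumptions, not the report’s, and the registry calls presume a registry-enabled MLflow tracking server.

```python
# A minimal sketch of one ModelOps step: promoting a trained model from
# the "lab" into a versioned, stage-governed life cycle. MLflow is used
# purely as an illustrative tool; the report prescribes no specific stack.
import mlflow
import mlflow.sklearn
import numpy as np
from mlflow.tracking import MlflowClient
from sklearn.dummy import DummyClassifier

with mlflow.start_run() as run:
    # Stand-in for real training; the model and metric are hypothetical.
    model = DummyClassifier(strategy="most_frequent").fit(
        np.zeros((4, 2)), [0, 0, 1, 1]
    )
    mlflow.sklearn.log_model(model, "model")
    mlflow.log_metric("auc", 0.87)

# Register the model so IT can track versions and audit its history.
version = mlflow.register_model(f"runs:/{run.info.run_id}/model", "churn-model")

# Promote through governed stages instead of copying artifacts by hand.
MlflowClient().transition_model_version_stage(
    name="churn-model", version=version.version, stage="Production"
)
```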

Getting to ModelOps to manage AI and ML involves IT leaders and professionals pulling together four key elements of the business value equation, as outlined by the report’s authors.

Ecosystems: These days, every successful technology endeavour requires connectivity and network power. “An AI-ready ecosystem should be as open as possible,” the report states. “Such ecosystems don’t just evolve naturally. Any company hoping to use an ecosystem successfully must develop next-generation integration architecture to support it and enforce open standards that can be easily adopted by external parties.”

Data: Get to know what data is important to the effort. “Validate its availability for training and production. Tag and label data for future usage, even if you’re not sure yet what that usage might be. Over time, you’ll create an enterprise inventory that will help future projects run faster.”

Platforms: Flexibility and modularity — the ability to swap out pieces as circumstances change — are key. The report’s authors advocate buying over building, since many providers have already worked out the details of building and deploying AI and ML models. “Determine your cloud strategy. Will you go all in with one cloud service provider? Or will you use different CSPs for different initiatives? Or will you take a hybrid approach, with some workloads running on-premises and some with a CSP? Some major CSPs typically offer more than just scalability and storage space, such as providing tools and libraries to help build algorithms and assisting with deploying models into production.”

People: Collaboration is the key to successful AI and ML delivery, but it’s also important that people have a sense of ownership over their parts of the projects. “Who owns the AI software and hardware – the AI team or the IT team, or both? This is where you get organizational boundaries that need to be clearly defined, clearly understood, and coordinated.” Along with data scientists, a group just as important to ModelOps is data engineers, who bring “significant expertise in using analytics and business intelligence tools, database software, and the SQL data language, as well as the ability to consistently produce clean, high-quality, ethical data.”

Feature Image Credit: IBM Media Relations

Sourced from ZDNet

Sourced from ElectricBot

Information acted upon is power.

Data is a crucial part of any business. Without it, you would be relying on guesswork and trial-and-error tactics. Everything revolves around data. Even our five senses are data-gathering machines that help us navigate through life.

Big data has become a buzzword these days. Many B2C and B2B businesses are heavily capitalizing on it. There are different ways in which you can use it to gain a competitive advantage. One big area is sales and marketing.

In this article, we will take a closer look at how big data is reshaping the marketing efforts of B2B companies. There are opportunities but also challenges in this new environment. Next, we will examine the connection between big data and the ABM approach, which is one of the most popular B2B tactics these days.

How can “Big Data” help exactly?

If you had to summarize “big data” in one sentence, you could say that it makes it possible to create high-impact, highly personalized marketing campaigns by providing deep insights into both the market and the consumer.

By having this deep insight, you will be able to create highly accurate and responsive campaigns. Knowing whom to target, when, and with what message and price is the key benefit “big data” provides.

How does this function?

Big data is essentially any collection of data that can’t be effectively analyzed by traditional data-processing models. Since data can arrive in structured or unstructured formats, and from numerous channels, you first need to make all of it accessible and usable. Then, data analytics techniques can be applied to process and analyze it effectively. These steps are usually carried out on a big data platform.
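
As a toy illustration of that two-step pattern (make disparate sources accessible first, then analyze them together), here is a minimal sketch using PySpark as a stand-in for a big data platform. The file paths, column names and the customer:<id> note format are all hypothetical.

```python
# A minimal sketch: unify a structured and an unstructured source, then
# run ordinary analytics over the combined data.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("unify-sources").getOrCreate()

# Structured source: transaction records with a known schema.
transactions = spark.read.csv("transactions.csv", header=True, inferSchema=True)

# Unstructured source: raw call-center notes, one record per line.
notes = spark.read.text("call_center_notes.txt")

# Make the unstructured data usable: extract the customer ID mentioned
# in each note so it can be joined against the structured data.
notes_tagged = notes.withColumn(
    "customer_id",
    F.regexp_extract("value", r"customer:(\d+)", 1).cast("long"),
)

# With both sources joined, standard aggregation applies.
spend_with_notes = (
    transactions.join(notes_tagged, on="customer_id", how="left")
    .groupBy("customer_id")
    .agg(F.sum("amount").alias("total_spend"), F.count("value").alias("note_count"))
)
spend_with_notes.show()
```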

Examples of data sources can be competitive analysis and market information, marketing research, product knowledge data, website traffic data, social media posts, press releases, call center logs, customer feedback, device data, 3rd party data, consumer sentiments, and transaction logs.

How can big data affect marketing?

Marketing and sales departments were historically divided, and they still are. They have different targets and often disagree on questions such as whether a lead is qualified enough, and whose fault it is when the lead doesn’t convert into a customer.

This tendency can be seen across the entire ecosystem of the company. Indeed, a company is just a collection of different systems; this is especially visible in how a large Fortune 500 enterprise operates. In the case of an enterprise, these systems can be the ERP, CRM, Product Information Management System, Order Management System, and Marketing Automation System.

This is the old, so-called silo approach to departments. Big data analytics helps us turn all these different systems into a unified framework with a precise collective set of goals. All these systems are sources of data that can be used to create a perfect go-to-market strategy, set up different marketing segments and audiences, or guide targeted advertising strategies and campaigns.

Big data is reshaping the anatomy of a company. In order to leverage the knowledge and data, all departments of the company should be integrated and interconnected.

Knowing where to focus marketing efforts is important. With big data, you can achieve more with precise targeting, lead generation, and increasing sales. Combined with predictive analytics and AI methods, big data can help us determine where the customers will be, how they may act, and what they may “do”. This responsiveness translates to better demand generation, the ability to create new markets and segments, while enhancing and optimizing both multi-channel and omni-channel marketing experiences.

Customer analytics, search engine optimization (SEO), search engine marketing (SEM), email marketing, Push/SMS/mobile app marketing, and digital ad platforms are just some of the areas where big data brings a competitive advantage.

Opportunities for B2B companies

Providing a unified and exceptional customer experience across all channels is the end goal that will increase your B2B sales. A great “above and beyond” customer experience can be achieved through personalization and customization.

Through personalization, we get relevance and precise targeting. A simple example is a webpage, offer, coupon, or ad that displays a different message to an existing visitor than to a new website visitor. If you are running an ecommerce business, you can deliver a perfectly timed discount, cross-sell, or up-sell campaign. This strategy, or a variation thereof, can apply to many types of businesses; webshops, digital storefronts, and marketplaces are typically the platforms that get the most out of these personalization strategies.
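
A minimal sketch of that new-versus-returning logic might look like the following; the visitor fields, thresholds and messages are hypothetical rather than any particular platform’s API.

```python
# A toy sketch of visitor-aware messaging: returning visitors see a
# different offer than first-time visitors. All fields are hypothetical.
def pick_banner(visitor: dict) -> str:
    if visitor.get("past_orders", 0) == 0:
        return "Welcome! Here's 10% off your first order."
    if visitor.get("days_since_last_order", 0) > 90:
        return "We missed you! Here's a deal on your favorites."
    return "Thanks for coming back. New arrivals picked for you."

print(pick_banner({"past_orders": 0}))                                # new visitor
print(pick_banner({"past_orders": 5, "days_since_last_order": 120}))  # lapsed visitor
```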

Another example is the ability to customize the price at the customer-product level. Again, this is highly valuable for any B2B eCommerce business, since each customer is different, and it wouldn’t be possible without an automated stream of data. You can also optimize the order and reorder processes by using information about previous purchases.
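
Customer-product level pricing can be sketched just as simply; the volume tiers and discount rates below are hypothetical.

```python
# A toy sketch of per-customer pricing fed by an automated data stream.
# The volume tiers and discount rates are hypothetical.
def personalized_price(base_price: float, annual_volume: float) -> float:
    if annual_volume > 100_000:
        return round(base_price * 0.85, 2)  # negotiated high-volume tier
    if annual_volume > 10_000:
        return round(base_price * 0.93, 2)  # mid-volume tier
    return base_price                       # list price

print(personalized_price(49.99, 250_000))   # 42.49
```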

Some companies structure their entire business around data to produce something called the ultimate product. The ultimate product or service is one that comes as a direct response to customer insight. By analyzing every piece of information to determine the most beneficial set of features, the product is all but guaranteed to be a success (if everything else goes well).

Challenges for B2B companies

If you compare the B2B and B2C markets, you will spot a tangible difference. Whereas you can have millions of users in a B2C market, you have far fewer in a typical B2B setting. This, in effect, is the main challenge for B2B companies.

The number of customers influences the quantity of data, and the quantity of data influences the ability to draw conclusions. However, eCommerce businesses are an exception here: although a B2B eCommerce business still has far fewer customers than a B2C store, it can produce a large enough pool of actionable data. Again, this is not a rule.

Big Data and ABM strategy

There is an interesting aspect of a B2B market...

The number of customers is not large, they usually require highly customizable services, and prices will differ from one account to another. The usual approach of segmenting customers into customer groups starts to lose its purpose, since every customer is an island unto itself. This line of thought has led to the creation of the ABM approach.

The ABM approach is the most popular B2B tactic. It stands for Account-Based Marketing. If you are not familiar with it, ABM is a business marketing concept that treats every account as a market. This approach represents a shift from the usual MQL (marketing qualified leads) strategy.

The major prerequisite for ABM is good, quality data: companies must identify key accounts and target companies while gathering strong firmographic and demographic data. After they have identified their markets, they should reach them across a variety of channels and deliver relevant content.

Can you see the pattern? ABM is the most successful approach for SMEs and other B2B players, and it will almost always translate into an increase in B2B sales. However, that’s not possible if you can’t obtain quality data: the kind that big data platforms can generate.

Conclusion

Every company in every age has its own unique challenges and opportunities, and we all know what happened to those who failed to act upon them. Luckily, with the aid of the right technology, even relatively small B2B companies have a chance to compete against bigger players.

The key is to have good quality data. 

The last few years have seen a growing trend among marketers. There is a rising demand for data quality. Each year more marketers list the quality of their data as a determining factor of their go-to-market strategy.

If you don’t know where to start, we, as a digital marketing agency, can help you.

We are experts in B2B digital marketing, and our main goal is to help you set up a successful ABM strategy, which you can’t do without quality data.

Everything is connected! Do you think you’ve got what it takes to be the winner?

Sourced from ElectricBot

By Paul Matthews

In 2018, the world was shaken by a misuse of big data: the Cambridge Analytica scandal, in which the British company allegedly bought and sold data points and related pieces of data illegally, put data science into the spotlight of the “mainstream business” world. After this scandal, in fact, data surpassed oil as the most valuable asset on Earth. Let’s analyse why and, most importantly, how this has happened.

Data Points: A Commercially Powerful Numerical Value

By “data point,” we mean a numerical value which, when associated with a specific entity (i.e., a person or a company), combines preferences, comments and tastes (from a numerical perspective) so that software can process them automatically. The power of data points lies in the fact that, when properly analysed, they can give thorough insights into a particular user’s preferences on a specific topic. The “exploitation” of Facebook searches on the Brexit topic, for example, relied on data points to serve highly tailored ads to people who were searching for “leave the UK” and related keywords. Although this may sound slightly political, it was actually confirmed by Cambridge Analytica itself last year after the company (and Facebook) were fined over $2 billion for buying and selling private pieces of information (data points).

Data Science: An Enterprise Niche Sector Going Mainstream

The possibility of creating tailored ads based on numerical values has intrigued business owners worldwide, to the point that they have decided to open data science divisions in companies that weren’t exactly at an “enterprise” level. Professionals in data acquisition, data processing, data science and Python development have been massively recruited by small and medium-sized companies worldwide in the past seven months. Despite a specific GDPR section strictly regulating data acquisition and processing, data scientists have definitely “gone mainstream” in the recent past.

From fintech to eCommerce, to pure lead generation, the usage of data science has become a constant in 2019.

Some Business Sectors Have Been Getting More Results Than Others…

As mentioned above, data processing and data science have been used by a variety of businesses in the past months. Fintech and real estate have been the most successful, in terms of lead generation tailored to data. Sectors like bridging loans, development finance and similar have seen a net 35% increase in organic investment in hiring Python developers able to process such delicate data and prepare targeted, tailored and highly convertible ads for social media channels. Lead generation has become very dependent on data in the recent past.

To Conclude

The usage of data in 2019 has definitely become mainstream. In the near future, we can safely say that GDPR rules will become even stricter; yet even with more specific regulations on acquisition and storage, data is still far from being fully regulated.

By Paul Matthews

Paul Matthews is a Manchester-based business and tech writer who writes to better inform business owners on how to run a successful business. You can usually find him at the local library or browsing Forbes’ latest pieces. Paul is currently consulting for a bridging loans company in Manchester.

Sourced from IT Brief

Tableau Software announced the general availability of Ask Data, which leverages the power of natural language processing to enable people to ask data questions in plain language and instantly get a visual response right in Tableau.

This patent-pending capability makes it easier for people, regardless of skill set, to engage with data and produce analytical insights they can share with others without having to do any setup or programming. Ask Data is available as part of Tableau’s newest release, Tableau 2019.1.

Customers can simply type a question such as “What were my sales this month?” and Tableau will return an interactive visualisation that they can continue to explore without limits, either with iterations, new questions, or drag-and-drop gestures.

There is no need to have a deep understanding of the data structure, no setup is required, and no programming skills are necessary.

Ask Data technology translates simple questions into analytical queries

Ask Data supposedly uses sophisticated algorithms that are driven by an understanding of the person’s intent, not keywords, which helps it understand a person’s question, anticipate needs, and allow for smart visualisation selection.

For example, when someone types in “APAC furniture” for their sales data, they want to filter “Product Name” to “Furniture,” and “Region” to “Asia Pacific.” Ask Data combines statistical knowledge about a data source with contextual knowledge about real-world concepts: “Furniture” is a common value for the “Product Name” field and “APAC” is an acronym of “Asia Pacific.”

Additionally, Ask Data’s parser automatically cuts through ambiguous language, making it easy for people to ask questions in a natural, colloquial way. This means that if a question could be interpreted in multiple ways, Ask Data will combine knowledge about the data source with past user activity and present a number of valid options to choose from, with the ability to refine the results if needed.
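
Tableau’s parser is proprietary and patent-pending, but the core idea in the “APAC furniture” example, mapping free-text tokens to field filters via value and synonym tables, can be illustrated with a toy sketch. The synonym table and field values below are hypothetical.

```python
# A toy illustration of turning tokens like "APAC furniture" into field
# filters. Tableau's actual algorithms are proprietary; these lookup
# tables are invented for the example.
FIELD_VALUES = {
    "Product Name": {"furniture", "technology", "office supplies"},
    "Region": {"asia pacific", "europe", "americas"},
}
SYNONYMS = {"apac": "asia pacific", "emea": "europe"}

def parse_question(text: str) -> dict:
    """Map each recognizable token to a {field: value} filter."""
    filters = {}
    for token in text.lower().split():
        token = SYNONYMS.get(token, token)   # resolve acronyms first
        for field, values in FIELD_VALUES.items():
            if token in values:
                filters[field] = token.title()
    return filters

print(parse_question("APAC furniture"))
# {'Region': 'Asia Pacific', 'Product Name': 'Furniture'}
```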

Tableau expands platform with Prep Conductor in new data management package

Tableau also announced today the general availability of Prep Conductor, a new product that enables organisations to schedule and manage self-service data preparation at scale with no programming or complicated setup. Tableau Prep Conductor is part of a new subscription package called Tableau Data Management.

Prep Conductor supposedly automates flows created in Prep Builder (the renamed Tableau data prep product). Prep Builder is already in use by more than 11,000 customer accounts, and Prep Conductor will help organisations use it more broadly to ensure that clean and analysis-ready data is always available.

It also gives people greater confidence in their data by providing added visibility and detail behind cleaning history and data connections, as well as alerts when processes are not operating as scheduled.

Tableau Prep Conductor gives IT the ability to monitor and set up automatic cleaning processes (flows) for their data across the entire server. It also allows customers to build permissions specifically around data flows and data sources in order to maintain control and meet data compliance standards and policies.

Feature Image Credit: Getty

Sourced from IT Brief

By Charles Babcock.

Forrester Research says competitive advantage will follow big data analytics’ move to the cloud; join in, or get left behind.

Failure to move your big data into the cloud may prove to be “an extinction level event” for companies that are on the verge of becoming digital dinosaurs, says Brian Hopkins, analyst with Forrester Research.

His June 15 report, Move Your Big Data Into The Public Cloud: You Won’t Be Able To Keep Up With Customers If You Don’t, concludes that companies that wish to be competitive in 2020 need to run their big data analytics in the public cloud. He was assisted by Srividya Sridharan, John Rymer, Boris Evelson, Dave Bartoletti and Christian Austin. The report is not publicly available, though Forrester has published a summary.

“The migration of data and analytics to the public cloud that began in 2016 is still going strong in 2017 and will continue in 2018,” said the authors.

The use of big data in the cloud is an example of the force of Moore’s Law: its usefulness will accelerate there as compute costs repeatedly fall while analytical systems grow more powerful. A key conclusion of the report is that firms that are not leveraging the public cloud for big data analytics will be hard-pressed to keep pace by 2020.

“You must immediately shift your big data investment from on-premises or hybrid toward public cloud,” Hopkins and co-researchers added as a key takeaway.

What’s wrong with on-premises analytics for a company’s big data? The Forrester report said the capital expenditures made will lock the big data users into the systems selected “when they need to be flexible.”

In addition, an internal staff will not be able to keep pace with the unpredictable change requirements that will keep popping up. The staff’s existing skills will determine which changing technologies they dare adopt, and even recognized, promising ones “are unlikely to be adopted at scale fast enough,” the Forrester researchers wrote.

Amazon Web Services, Google, IBM, Microsoft and Salesforce, on the other hand, will continue to invest heavily in big data and rapidly expand their services in competition with one another. In the public cloud, one successful new system gets leveraged by others. The report quoted Dr. Marcin Poetrzyk, head of analytics at Swisscom, as saying, “Concepts such as serverless computing can bring (big data analytics) to the next level. Public cloud vendors can scale innovation better than their on-premises competitors.”

The authors warned that what might look at this stage like a gradual shift could become a runaway freight train. “Early on the shift seems slow and firms think that they have time to react. As costs drop and power doubles, the unprepared get left behind. This is the basic plot of every disruption story ever told….”

Enterprise architects understand the advantages of the public cloud. Nevertheless, they are likely to continue to recommend on-premises investment because they foresee a large total cost of ownership in the public cloud over a five- or ten-year period. But researchers Hopkins and peers project that competition in the public cloud will keep cutting prices in half every few years for both compute and storage. The falling costs “will make the public cloud economic incentives irresistible,” they wrote.
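
A quick back-of-the-envelope calculation shows why a five- or ten-year TCO estimate priced at today’s rates can mislead. This sketch assumes a hypothetical $100,000-per-year analytics bill and takes the report’s “every few years” halving as once every three years.

```python
# Back-of-the-envelope: the cost of a fixed analytics workload if public
# cloud prices halve every three years (both figures are assumptions).
annual_cost = 100_000.0
halving_period_years = 3

for year in range(0, 13, 3):
    cost = annual_cost * 0.5 ** (year / halving_period_years)
    print(f"year {year:2d}: ${cost:,.0f} per year for the same capacity")
# year 0: $100,000 ... year 12: $6,250 (a 16x reduction)
```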

Resisters will cite compliance, data security, liability and brand perception as reasons not to go into the public cloud. The longer they delay, the more advantage will pass to those who adopt public cloud analytics early. “By 2020, firms that are not fully leveraging the public cloud for big data analytics will be hard-pressed to keep the pace set by digital leaders….” they wrote.

As storage and compute double every few years for the same price, “leaders will innovate faster, dealing a death blow to laggards,” the analysts warned.

Forrester based its report on a big data survey conducted last year, along with more recent interviews with American Express, Bose, Walmart, Amazon Web Services, IBM, Hortonworks, Logitech, Databricks, GoodData and Qubole.

By Charles Babcock

Charles Babcock is an editor-at-large for InformationWeek and author of Management Strategies for the Cloud Revolution, a McGraw-Hill book. He is the former editor-in-chief of Digital News, former software editor of Computerworld and former technology editor of Interactive …

Sourced from Information Week