By Deepak Bansal

Digital marketing is a rapidly evolving field, and as we move toward 2025, several emerging trends are set to reshape how brands connect with their audiences.

The upcoming years promise significant innovation, driven by technological advancements, changes in consumer behaviour and a deepening focus on personalization and data. As digital marketers, staying ahead of these trends is essential to create impactful strategies that drive engagement and results.

Here are six key trends to watch in 2025 and beyond.

1. Artificial Intelligence And Machine Learning In Marketing

Artificial intelligence (AI) and machine learning (ML) have already transformed digital marketing, and I predict their role will become even more prominent, further optimizing everything from customer segmentation to content creation.

Chatbots and virtual assistants continue to evolve, soon providing more human-like interactions. As AI tools become more intuitive and accessible, businesses can automate routine tasks like lead nurturing and email marketing, freeing up human teams for more strategic work.

To prepare, marketing teams should develop skills in AI-driven content tools like Jasper and ChatGPT, particularly when it comes to customer segmentation and content personalization. I believe it will become important to have some familiarity with AI-driven customer service tools, such as chatbots equipped with sentiment analysis and natural language processing (NLP) features.

Overall, keep an eye on AI advancements in personalized user interactions that can interpret and respond to customer sentiment. This shift will allow brands to offer a more customized customer experience, making AI an essential asset in customer relations.

2. Voice Search And Voice Commerce

With the proliferation of smart speakers and voice assistants like Amazon’s Alexa, Google Home and Apple’s Siri, voice search has become a mainstream method of information gathering. A recent report by NPR and Edison Research shows that at least 35% of U.S. households now own a smart speaker, accelerating the shift toward voice commerce.

It’s important to note how voice search optimization differs from traditional SEO as users ask questions conversationally. Instead of typing “best coffee shops in Seattle,” a voice search might be “What are the best coffee shops near me?” Brands should focus on long-tail keywords and natural language to capture this growing audience.

3. The Rise Of Augmented Reality (AR) And Virtual Reality (VR)

While AR and VR were once primarily associated with gaming, I see them now transforming industries like retail, allowing for immersive shopping experiences.

Major brands like IKEA and Sephora have already implemented AR. IKEA’s AR app allows users to view furniture in their own space, while Sephora’s AR technology lets users experiment with virtual makeup try-ons. These applications are reshaping customer expectations and proving AR’s utility across retail sectors.

For brands looking to embrace AR and VR, consider investing in an AR experience platform that aligns with your industry. For example, fashion and beauty brands could explore virtual try-ons, while real estate companies might benefit from virtual property tours. Early adoption can enhance customer engagement and differentiate brands in competitive markets.

4. The Continued Dominance Of Video Content

Video content remains a powerful force in digital marketing, with platforms like YouTube, TikTok and Instagram Reels focusing on short-form, interactive formats.

As consumer preferences shift toward snackable, engaging content, brands can use video to deliver information quickly and creatively. Here are my tips for creating engaging video content that speaks to developing trends:

• Embrace short-form and live streaming. Use live streaming for real-time engagement, which is ideal for launches or Q&As. On top of this, short-form platforms like TikTok and YouTube Shorts can boost reach. I find that tools like InShot and Canva can help simplify quality video creation.

• Adapt to new platform offerings to encourage interaction. With new features on YouTube Shorts, Instagram Reels and TikTok, brands can leverage interactive tools (e.g., polls, live Q&As) to engage viewers. Regularly analyse metrics to refine content strategies based on what resonates most in each niche.

5. Personalization At Scale

Consumers expect personalized experiences across all digital channels. By 2025, I foresee personalization advancing beyond basic customization and enabling brands to deliver hyper-personalized content and recommendations.

To prepare for this increasing focus on hyper-personalization, I first recommend investing in dynamic content platforms. Brands can consider platforms like HubSpot or Marketo that offer advanced personalization features. Look for ways to create dynamic content adjustments that reflect user data, ensuring messages are relevant to each visitor.

AI-driven personalization can also allow your brand to design user journeys that proactively meet customer needs. Develop campaigns that consider various stages of the buyer journey, from interest to decision-making, for highly relevant interactions.

6. Social Commerce And Shoppable Content

The line between social media and e-commerce is blurring, with platforms like Instagram, Facebook and Pinterest integrating in-app shopping features. This development allows consumers to purchase products directly from their social media feeds, creating a seamless browsing-to-purchase experience.

To optimize for social commerce:

• Create engaging, shoppable content. Focus on visually appealing and interactive content that encourages sharing. Use shoppable posts on Instagram and Facebook to streamline the buying process and improve conversion rates.

• Partner with influencers. Collaborate with influencers or incorporate user-generated content to broaden reach. This strategy helps build credibility and connects your brand with new audiences.

Preparing For 2025 And Beyond

To stay competitive, I believe marketers need to embrace emerging technologies, prioritize personalization and adapt to shifts in consumer behavior. AI, voice search, AR and video will dominate digital marketing in 2025, while data privacy and sustainability will become essential for shaping customer relationships.

I think the brands that thrive are the ones that blend innovation with authenticity, creating meaningful, personalized experiences that resonate with consumers. By staying ahead of these trends, digital marketers can craft impactful campaigns that build lasting connections with their audiences.

Feature Image Credit: Getty

By Deepak Bansal

Deepak Bansal, Director of Digital Marketing, Atihsi LLC and CEO & Founder, Clearpath Technology Pvt Ltd. Read Deepak Bansal’s full executive profile here.

Sourced from Forbes

By Jack Kelly

As we stand on the cusp of a new year, the job market continues to evolve at an unprecedented pace, driven by technological advancements and shifting economic realities. In this dynamic environment, professionals across all industries are recognizing the critical importance of upskilling and reskilling to remain competitive and relevant.

The coming year presents a golden opportunity to invest in yourself by acquiring the in-demand skills that employers are actively seeking, ensuring you’re well-positioned for career growth and new opportunities in an increasingly digital and automated world.

The rapid acceleration of digital transformation, catalysed by recent global events, has reshaped the way businesses operate and the skills they require from their workforce. From artificial intelligence and data analytics to cloud computing and cybersecurity, the demand for tech-savvy professionals continues to soar across sectors.

In-Demand Hard Skills For The New Year

As traditional job roles evolve and new positions emerge, the ability to learn and adapt quickly has become a critical asset in itself. By proactively developing these in-demand hard skills, you not only enhance your marketability but also position yourself to thrive in the face of future disruptions and opportunities in the job market.

1. Artificial Intelligence and Machine Learning

AI and machine learning are becoming indispensable skills in the job market, with their importance growing exponentially across industries. The demand for AI-related skills is 3.5 times higher than the average job skill, reflecting the rapid integration of these technologies in various sectors, a PwC report revealed.

This surge in demand is driven by the transformative potential of AI and ML in the workplace. This fast-emerging technology is expected to automate up to 300 million jobs in the United States and Europe, according to investment bank Goldman Sachs, while simultaneously creating 97 million new roles that require advanced technical skills, as predicted by the World Economic Forum. This shift is not just about job displacement; it’s about job evolution. Companies adopting AI are planning to expand their workforce, with 91% of firms integrating AI aiming to increase their employee numbers by 2025.

2. Cloud Computing

Cloud computing skills will remain in high demand, as the industry continues its explosive growth and transformation of business operations across sectors. Gartner forecasts global end-user cloud spending to reach $723 billion in 2025, a 21.5% increase from the previous year.

The rise of generative AI and the need for integrated platforms are accelerating cloud adoption, with 90% of organizations projected to have hybrid cloud deployments by 2027. As organizations continue to migrate their applications and workloads to the cloud, with 48% planning to move at least half of their applications within a year, proficiency in cloud computing will be crucial for professionals looking to stay relevant in the rapidly evolving job market of 2025.

3. Cybersecurity

Cybersecurity skills are highly coveted, as the digital landscape faces unprecedented threats and skyrocketing costs associated with cybercrimes. By 2025, global cybercrime costs are projected to reach a staggering $10.5 trillion annually, according to a report by Cybercrime Magazine.

This surge in cybercrime is accompanied by a severe shortage of qualified professionals in the field. The cybersecurity job market is expected to grow by 33% between 2023 and 2033, with an estimated 3.5 million unfilled cybersecurity positions worldwide by the end of 2025. This talent gap is further exacerbated by the rapid evolution of cyber threats, with encrypted threats increasing by 92% in 2024 and malware rising by 30% in the first half of the same year.

4. Data Analysis

Businesses are increasingly relying on transforming unstructured data into actionable insights to drive growth, improve user satisfaction and maintain a competitive edge in the market. The demand for data analytics expertise is surging across industries, with trends like AI-enhanced analytics, natural language processing and advanced data visualization reshaping how organizations leverage their data assets.

As organizations grapple with the challenges of data quality and governance, professionals skilled in ensuring data integrity and implementing effective data strategies will be in high demand, making data analysis an essential skill.

5. Digital Marketing

In today’s digital landscape, businesses are leveraging online social platforms to connect with and engage their target audiences and customers.

With global digital ad spending projected to surpass $740 billion in 2024, and over 5 billion social media users worldwide, proficiency in digital marketing strategies will be crucial for professionals looking to thrive in the competitive job market.

Feature Image Credit: Getty

By Jack Kelly

Jack Kelly has been a senior contributor for Forbes since 2018, covering topics in career development, job market trends and workplace dynamics. His articles often focus on practical advice for job seekers and employees, as well as covering the latest news impacting workers so they can make informed decisions about their careers. Read More

Sourced from Forbes

By Alessio Francesco Fedeli

The current digital landscape is marked by the struggle to achieve visibility for your business online and to reach the right audience amid a wave of competition. Search engine marketing (SEM) offers pivotal strategies for achieving this, and with ongoing advancements in artificial intelligence (AI) and machine learning, marketers have more opportunities than ever for growth. These advancements are revolutionising SEM, significantly enhancing the efficiency and effectiveness of business campaigns.

AI-enhanced SEM tools stand at the vanguard of this revolution, utilizing advanced algorithms and machine learning capabilities to transform every facet of search engine marketing comprehensively. From automating the process of keyword research to refining advertisement creation, and from optimising bid management to improving performance analysis, these tools furnish marketers with the capacity to attain exceptional outcomes. They transcend conventional tool functionality; they act as catalysts for change, facilitating precise targeting and real-time modifications previously considered unattainable.

Exploring further into AI and machine learning within SEM reveals that these technologies are not only augmenting existing methodologies but also fostering novel strategies. Marketers harnessing these tools gain the ability to predict market trends accurately, comprehend consumer behaviour with enhanced precision, and implement campaigns that are both cost-efficient and high-impact. The advent of AI-driven SEM marks a transformative era in digital advertising, reshaping the landscape in ways that are beginning to unfold.

Leveraging AI and machine learning in SEM


The Role of AI in search engine marketing

AI revolutionises SEM by making complex tasks simple. It sifts through vast datasets to unearth insights beyond human capability. By fine-tuning keyword research and bid optimisation, AI ensures ads hit the mark every time. It doesn’t stop there; AI tailors ad content for individual users, predicting trends and making swift, informed decisions. This not only sharpens the marketer’s toolbox but also enhances the consumer’s journey, significantly boosting conversion rates. With AI in SEM, ads become more than just noise; they’re strategic moves in the digital marketplace.

Benefits of Using Machine Learning in SEM

Although some marketers remain apprehensive, it is important to understand the benefits of incorporating machine learning into your SEM strategy.

• Enhanced targeting accuracy: By analysing user data, machine learning identifies the most relevant audience segments, improving the precision of targeting efforts.

• Optimised bid adjustments: Machine learning algorithms navigate the volatile bidding landscape, making real-time adjustments to maximize ROI.

• Improved ad performance: It analyses what works best for ad performance, from copy to design, ensuring optimal engagement and conversion rates.

• Fraud detection and protection: Machine learning acts as a guardian against click fraud, safeguarding advertising budgets from dishonest practices by spotting and mitigating fraudulent activities.

This integration offers strategic advantages that enable marketers to be more effective in a competitive digital landscape. By implementing machine learning, businesses can not only optimise their advertising efforts but also protect their investments, so that every dollar spent is an investment towards tangible results.
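To make the real-time bid adjustment idea concrete, here is a deliberately simplified sketch in Python. The rule and the numbers are illustrative only, not any platform’s actual algorithm: it caps the bid at the expected value of a click, derived from a predicted conversion rate and a target cost per acquisition.

```python
def adjust_bid(base_bid: float, predicted_conv_rate: float,
               target_cpa: float) -> float:
    """Toy real-time bid rule: never pay more per click than a click is
    expected to be worth (predicted conversion rate * target CPA), and
    never drop below 10% of the base bid."""
    value_per_click = predicted_conv_rate * target_cpa
    return max(0.1 * base_bid, min(base_bid, value_per_click))

# A click predicted to convert 5% of the time, with a $40 target CPA,
# is worth at most $2.00, so the $2.50 base bid is trimmed to $2.00.
bid = adjust_bid(base_bid=2.50, predicted_conv_rate=0.05, target_cpa=40.0)
```

Production systems learn the conversion-rate estimate from historical data and update it continuously; the point here is only the shape of the decision rule.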

Incorporating AI and machine learning technologies in SEM campaigns

Choosing the right AI tools is the first step to SEM success. The ideal tool offers a comprehensive suite for managing keywords, bids, ads, and performance, fitting seamlessly into your marketing stack. On the machine learning front, clarity in objectives paves the way for impactful integration. Whether aiming for higher CTRs or lower CPA, leveraging historical data and machine learning algorithms to predict and adjust is key. Constant experimentation and analysis refine strategies, ensuring SEM campaigns not only meet but exceed expectations. In the rapidly evolving world of SEM, AI and machine learning are not just options but necessities.

Strategies for successful implementation


In the evolving landscape of search engine marketing (SEM), leveraging AI and machine learning can set a campaign apart, maximising efficiency and returns. Below are strategies detailing how to integrate these advanced technologies effectively.

Choosing the right AI tools for SEM

In the realm of SEM, it is critical to select AI tools that are congruent with your marketing objectives. The market is replete with a myriad of options, each purporting to transform your SEM strategies radically. Nonetheless, not every tool offers equal value. It is advisable to opt for tools that provide an extensive analysis of keywords, insights into competitors, and capabilities for automated bid management. These functionalities ensure that your campaigns are both precisely targeted and economically efficient. Furthermore, the implementation of AI-driven tools for content optimisation can notably increase ad relevance, thereby enhancing click-through rates (CTR) and reducing cost per acquisition (CPA).

Conducting trials with various tools before finalizing a decision is imperative to identify a solution that is specifically catered to your requirements. Platforms offering advanced analytics should be given priority as they afford actionable insights critical for ongoing refinement. It is important to recognize that the effective use of AI in SEM transcends merely selecting cutting-edge technology; it encompasses the strategic application of these tools to continually refine and advance marketing strategies over time.

Integrating machine learning algorithms into SEM practices

Machine learning algorithms constitute a cornerstone in the advancement of Search Engine Marketing (SEM) strategies, offering unprecedented insights into consumer behaviour and preferences. To capitalize on this opportunity, it is essential to integrate machine learning SEM technologies, emphasizing predictive analytics. Such an approach enables a deeper understanding of the interactions between different demographics and your advertisements, thereby improving audience segmentation.

Moreover, machine learning capabilities enable the automation of the most labour-intensive tasks within SEM, including bid management and A/B testing. This automation not only conserves precious time but also markedly elevates the efficiency of marketing campaigns. By adapting SEM practices to incorporate these algorithms, advertisements are perpetually optimised for performance, obviating the need for continuous manual intervention.

The fusion of machine learning’s predictive analytics with AI-enabled creative optimisation represents a pivotal evolution in Search Engine Marketing (SEM) strategies. This integrative approach allows for the real-time modification of advertisement components, including imagery and text, to better match user intentions, thereby markedly enhancing campaign outcomes.

Employing machine learning and AI within SEM goes beyond simply embracing cutting-edge technology; it denotes an ongoing dedication to a cycle of testing, education, and improvement. This dedication positions marketing endeavours at the vanguard of innovation during a period marked by rapid digital change.

Measuring success and ROI


Utilising metrics and KPIs to evaluate AI and machine learning impact

The integration of Artificial Intelligence (AI) and Machine Learning (ML) into Search Engine Marketing (SEM) strategies has profoundly altered the approaches utilized by digital marketing experts.

  • For an accurate assessment of the effectiveness of these advanced SEM technologies, focusing on relevant metrics and Key Performance Indicators (KPIs) is essential.
  • These criteria provide a transparent evaluation of the performance enhancements brought about by AI and ML.
  • They enable organizations to measure success and calculate Return on Investment (ROI) with greater accuracy.

Primarily, conversion rates emerge as a crucial metric. They serve as direct indicators of the efficiency of AI-enhanced ad targeting and bid management strategies, reflecting whether such technological advancements result in an increased proportion of visitors performing desired actions, such as completing purchases or registering for newsletters.

Cost per Acquisition (CPA) represents another fundamental metric. It illustrates the effectiveness with which AI and ML tools manage advertising expenditures to secure new clientele. Reduced CPA values indicate that these advanced SEM technologies are not only pinpointing the appropriate audience but also achieving this in a financially prudent manner.

Click-through rates (CTR) hold significant importance as well. An elevated CTR signifies that the predictive analytics and automated content optimisation facilitated by AI are effectively engaging the target demographic, thereby increasing their propensity to interact with advertisements.

Moreover, Return on Ad Spend (ROAS) is an essential measure of overall operational efficacy. It quantifies the revenue generated for every unit of currency expended on SEM initiatives. An enhancement in ROAS denotes that integrating AI and ML into SEM strategies is yielding more lucrative campaigns.
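The four metrics above are simple ratios of raw campaign counts. A minimal Python helper (illustrative; these are just the standard metric definitions, not any ad platform’s API) makes the relationships explicit:

```python
def sem_kpis(impressions: int, clicks: int, conversions: int,
             spend: float, revenue: float) -> dict:
    """Compute the four SEM metrics discussed above from raw campaign counts."""
    return {
        "ctr": clicks / impressions,              # click-through rate
        "conversion_rate": conversions / clicks,  # share of clicks that convert
        "cpa": spend / conversions,               # cost per acquisition
        "roas": revenue / spend,                  # return on ad spend
    }

# 10,000 impressions, 300 clicks, 15 conversions, $600 spent, $2,400 earned:
# CTR 3%, conversion rate 5%, CPA $40, ROAS 4.0
kpis = sem_kpis(impressions=10_000, clicks=300, conversions=15,
                spend=600.0, revenue=2_400.0)
```

Tracking these together matters: a campaign can raise CTR while worsening CPA, and only ROAS reveals whether the extra clicks actually paid for themselves.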

Through meticulous observation of these metrics, organizations can comprehensively assess the impact of Artificial Intelligence (AI) and Machine Learning (ML) on their Search Engine Marketing (SEM) strategies. This analysis highlights not only the achievement of set goals but also identifies potential areas for enhancement. As AI and ML evolve, securing a competitive advantage in SEM requires ongoing vigilance and an adaptable methodology informed by data-driven insights.

Utilising machine learning and AI is important in the pursuit of success in digital marketing. However, SEM is just one aspect of marketing, standing shoulder to shoulder with methods like SEO. Knowing the difference between the two will help you determine which to use, or how to combine them, for a more prosperous digital marketing campaign.

Feature Image Credit: This photo was generated using Dall-E

By Alessio Francesco Fedeli

Graduating from Webster University with a degree in Management with an emphasis on International Business, Alessio is a Thai-Italian with a multicultural perspective on Thailand and abroad. By the same token, as a person passionate about sports and activities, Alessio also gives insight into various spots for a fun and healthy lifestyle.

Sourced from Thaiger

By Isaac Sacolick

Unstructured text and data are like gold for business applications and the company bottom line, but where to start? Here are three tools worth a look.

Developers and data scientists use generative AI and large language models (LLMs) to query volumes of documents and unstructured data. Open source LLMs, including Dolly 2.0, EleutherAI Pythia, Meta AI LLaMa, StabilityLM, and others, are all starting points for experimenting with artificial intelligence that accepts natural language prompts and generates summarized responses.

“Text as a source of knowledge and information is fundamental, yet there aren’t any end-to-end solutions that tame the complexity in handling text,” says Brian Platz, CEO and co-founder of Fluree. “While most organizations have wrangled structured or semi-structured data into a centralized data platform, unstructured data remains forgotten and underleveraged.”

If your organization and team aren’t experimenting with natural language processing (NLP) capabilities, you’re probably lagging behind competitors in your industry. In the 2023 Expert NLP Survey Report, 77% of organizations said they planned to increase spending on NLP, and 54% said their time-to-production was a top return-on-investment (ROI) metric for successful NLP projects.

Use cases for NLP

If you have a corpus of unstructured data and text, some of the most common business needs include

  • Entity extraction by identifying names, dates, places, and products
  • Pattern recognition to discover currency and other quantities
  • Categorization into business terms, topics, and taxonomies
  • Sentiment analysis, including positivity, negation, and sarcasm
  • Summarizing the document’s key points
  • Machine translation into other languages
  • Dependency graphs that translate text into machine-readable semi-structured representations
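Several of these needs, such as entity extraction of dates and pattern recognition of currency amounts, can be prototyped with nothing more than regular expressions before reaching for a full NLP library. A toy sketch (the sample text and patterns are invented for illustration, and real documents need far more robust patterns):

```python
import re

TEXT = "On 2024-03-15, Acme Corp paid $1,250.00 and €300 for cloud services."

# ISO-style dates and currency amounts, two of the patterns listed above.
DATE_RE = re.compile(r"\b\d{4}-\d{2}-\d{2}\b")
MONEY_RE = re.compile(r"[$€£]\s?\d[\d,]*(?:\.\d+)?")

dates = DATE_RE.findall(TEXT)      # -> ['2024-03-15']
amounts = MONEY_RE.findall(TEXT)   # -> ['$1,250.00', '€300']
```

Regexes quickly hit their limits (spelled-out dates, ambiguous numbers, multilingual text), which is exactly where the NLP libraries below earn their keep.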

Sometimes, having NLP capabilities bundled into a platform or application is desirable. For example, LLMs support asking questions; AI search engines enable searches and recommendations; and chatbots support interactions. Other times, it’s optimal to use NLP tools to extract information and enrich unstructured documents and text.

Let’s look at three popular open source NLP tools that developers and data scientists are using to perform discovery on unstructured documents and develop production-ready NLP processing engines.

Natural Language Toolkit

The Natural Language Toolkit (NLTK), released in 2001, is one of the older and more popular NLP Python libraries. NLTK boasts more than 11,800 stars on GitHub and lists over 100 trained models.

“I think the most important tool for NLP is by far Natural Language Toolkit, which is licensed under Apache 2.0,” says Steven Devoe, director of data and analytics at SPR. “In all data science projects, the processing and cleaning of the data to be used by algorithms is a huge proportion of the time and effort, which is particularly true with natural language processing. NLTK accelerates a lot of that work, such as stemming, lemmatization, tagging, removing stop words, and embedding word vectors across multiple written languages to make the text more easily interpreted by the algorithms.”

NLTK’s benefits stem from its endurance, with many examples for developers new to NLP, such as this beginner’s hands-on guide and this more comprehensive overview. Anyone learning NLP techniques may want to try this library first, as it provides simple ways to experiment with basic techniques such as tokenization, stemming, and chunking.
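Here is a minimal taste of NLTK’s tokenization and stemming, using components that need no separate data download (the example sentence is ours, not from the library’s documentation):

```python
from nltk.tokenize import wordpunct_tokenize  # regex-based, no corpus needed
from nltk.stem import PorterStemmer

sentence = "NLTK accelerates stemming, tokenization, and chunking."

tokens = wordpunct_tokenize(sentence)
stems = [PorterStemmer().stem(t) for t in tokens]
# e.g. "stemming" -> "stem", "chunking" -> "chunk"
```

The more common `word_tokenize` function gives better results but requires downloading the `punkt` tokenizer data first via `nltk.download`.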

spaCy

spaCy is a newer library, with its version 1.0 released in 2016. spaCy supports over 72 languages and publishes its performance benchmarks, and it has amassed more than 25,000 stars on GitHub.

“spaCy is a free, open-source Python library providing advanced capabilities to conduct natural language processing on large volumes of text at high speed,” says Nikolay Manchev, head of data science, EMEA, at Domino Data Lab. “With spaCy, a user can build models and production applications that underpin document analysis, chatbot capabilities, and all other forms of text analysis. Today, the spaCy framework is one of Python’s most popular natural language libraries for industry use cases such as extracting keywords, entities, and knowledge from text.”

Tutorials for spaCy show similar capabilities to NLTK, including named entity recognition and part-of-speech (POS) tagging. One advantage is that spaCy returns document objects and supports word vectors, which can give developers more flexibility for performing additional post-NLP data processing and text analytics.
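The document objects mentioned above are easy to see with a blank English pipeline, which ships with spaCy and needs no pretrained model download (named entity recognition and POS tagging would require a model such as `en_core_web_sm`):

```python
import spacy

# Blank pipeline: tokenizer only, no model weights to download.
nlp = spacy.blank("en")
doc = nlp("spaCy returns Doc objects instead of plain strings.")

# Each token carries attributes useful for downstream processing.
tokens = [token.text for token in doc]
alpha = [token.text for token in doc if token.is_alpha]  # drops punctuation
```

Because `doc` is a structured object rather than a list of strings, later stages can attach entities, vectors, and dependency information to the same tokens.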

Spark NLP

If you already use Apache Spark and have its infrastructure configured, then Spark NLP may be one of the faster paths to begin experimenting with natural language processing. Spark NLP has several installation options, including AWS, Azure Databricks, and Docker.

“Spark NLP is a widely used open-source natural language processing library that enables businesses to extract information and answers from free-text documents with state-of-the-art accuracy,” says David Talby, CTO of John Snow Labs. “This enables everything from extracting relevant health information that only exists in clinical notes, to identifying hate speech or fake news on social media, to summarizing legal agreements and financial news.”

Spark NLP’s differentiators may be its healthcare, finance, and legal domain language models. These commercial products come with pre-trained models to identify drug names and dosages in healthcare, financial entity recognition such as stock tickers, and legal knowledge graphs of company names and officers.

Talby says Spark NLP can help organizations minimize the upfront training in developing models. “The free and open source library comes with more than 11,000 pre-trained models plus the ability to reuse, train, tune, and scale them easily,” he says.

Best practices for experimenting with NLP

Earlier in my career, I had the opportunity to oversee the development of several SaaS products built using NLP capabilities. My first was a SaaS platform for searching newspaper classified advertisements, including cars, jobs, and real estate. I then led the development of NLP systems for extracting information from commercial construction documents, including building specifications and blueprints.

When starting NLP in a new area, I advise the following:

  • Begin with a small but representative sample of the documents or text.
  • Identify the target end-user personas and how extracted information improves their workflows.
  • Specify the required information extractions and target accuracy metrics.
  • Test several approaches and use speed and accuracy metrics to benchmark.
  • Improve accuracy iteratively, especially when increasing the scale and breadth of documents.
  • Expect to deliver data stewardship tools for addressing data quality and handling exceptions.
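The "test several approaches and benchmark" step above can be sketched as a tiny harness. Both extractors below are hypothetical stand-ins for real candidate approaches, and the labeled sample is invented:

```python
import re

def accuracy(extractor, labeled_docs) -> float:
    """Fraction of documents where the extractor matches the gold label."""
    hits = sum(1 for text, gold in labeled_docs if extractor(text) == gold)
    return hits / len(labeled_docs)

# Two hypothetical approaches for pulling a price out of a classified listing.
def regex_extractor(text):
    m = re.search(r"\$(\d+)", text)
    return int(m.group(1)) if m else None

def naive_extractor(text):
    digits = [w for w in text.split() if w.isdigit()]
    return int(digits[0]) if digits else None

sample = [("2 bed condo $950", 950), ("Car, 4 doors, $3000", 3000)]
# The regex approach scores 1.0 here; the naive one grabs "2" and "4" instead.
```

The same harness extends naturally to timing each extractor, which covers the speed half of the benchmark.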

You may find that the NLP tools used to discover and experiment with new document types will aid in defining requirements. Then, expand the review of NLP technologies to include open source and commercial options, as building and supporting production-ready NLP data pipelines can get expensive. With LLMs in the news and gaining interest, underinvesting in NLP capabilities is one way to fall behind competitors. Fortunately, you can start with one of the open source tools introduced here and build your NLP data pipeline to fit your budget and requirements.

Feature Image Credit: TippaPatt/Shutterstock

By Isaac Sacolick

Isaac Sacolick is president of StarCIO and the author of the Amazon bestseller Driving Digital: The Leader’s Guide to Business Transformation through Technology and Digital Trailblazer: Essential Lessons to Jumpstart Transformation and Accelerate Your Technology Leadership. He covers agile planning, devops, data science, product management, and other digital transformation best practices. Sacolick is a recognized top social CIO and digital transformation influencer. He has published more than 900 articles at InfoWorld.com, CIO.com, his blog Social, Agile, and Transformation, and other sites.

Sourced from InfoWorld

In this post, you will learn to clarify business problems & constraints, understand problem statements, select evaluation metrics, overcome technical challenges, and design high-level systems.

The LinkedIn feed is the starting point for millions of users and forms their first impression of the site, which, as you know, will last. An interesting, personalized feed for each user delivers LinkedIn's core value: keeping users connected to their network and its activities while building their professional identity.

LinkedIn’s personalized feed lets users see updates from their connections quickly, efficiently, and accurately. It also filters out spammy, unprofessional, and irrelevant content to keep users engaged. To do this, LinkedIn filters the newsfeed in real time, applying a set of rules to determine what content belongs based on a series of actionable indicators and predictive signals. This solution is powered by machine learning and deep learning algorithms.

In this article, we will cover how LinkedIn uses machine learning to rank the user's feed, following the workflow of a conventional machine learning project.

The machine learning project workflow starts with the business problem statement and defining the constraints, followed by data collection and preparation, then modeling, and finally deployment into production. These steps will be discussed in the context of ranking the LinkedIn feed.

How LinkedIn Uses Machine Learning To Rank Your Feed 


1. Clarify Business Problems & Constraints

1.1. Problem Statement

Design a personalized LinkedIn feed that maximizes users' long-term engagement. The feed should surface beneficial professional content for each user, so it is important to develop models that eliminate low-quality content and keep only high-quality professional content. At the same time, the filtering must not be overzealous, or it will produce many false positives. We should therefore aim for high precision and recall in the classification models.

We can measure user engagement through the click probability, also known as the click-through rate (CTR). The LinkedIn feed contains different activity types, each with a different CTR; this should be taken into account when collecting data and training the models. There are five main activity types:

  • Building connections: A member connects with or follows another member, company, or page.
  • Informational: Sharing posts, articles, or pictures.
  • Profile-based activity: Activities related to the profile, such as changing the profile picture, adding a new experience, or changing the profile header.
  • Opinion-specific activity: Activities related to member opinions, such as liking, commenting on, or reposting a post, article, or picture.
  • Site-specific activity: Activities specific to LinkedIn, such as endorsements and job applications.

1.2. Evaluation Metrics Design

There are two main types of metrics: offline and online evaluation metrics. We use offline metrics to evaluate our model during the training and modeling phase. The next step is to move to a staging/sandbox environment to test for a small percentage of the real traffic. In this step, the online metrics are used to evaluate the impact of the model on the business metrics. If the revenue-related business metrics show a consistent improvement, it will be safe to expose the model to a larger percentage of the real traffic.

Offline Metrics

Maximizing CTR can be formalized as training a supervised binary classifier. For the offline metric, normalized cross entropy can be used, since it is less sensitive to the background CTR:

NCE = [ −(1/N) Σᵢ ( yᵢ log(pᵢ) + (1 − yᵢ) log(1 − pᵢ) ) ] / [ −( p log(p) + (1 − p) log(1 − p) ) ]

where yᵢ ∈ {0, 1} is the click label, pᵢ is the predicted click probability for sample i, N is the number of samples, and p is the background (average) CTR.
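As a concrete sketch, normalized cross entropy is the average log loss divided by the entropy of the background CTR; the labels and predictions below are toy values for illustration:

```python
import math

def normalized_cross_entropy(y_true, p_pred):
    """Average log loss normalized by the entropy of the background CTR.

    Dividing by the background-CTR entropy makes the metric comparable
    across datasets with different base click rates.
    """
    n = len(y_true)
    log_loss = -sum(
        y * math.log(p) + (1 - y) * math.log(1 - p)
        for y, p in zip(y_true, p_pred)
    ) / n
    ctr = sum(y_true) / n  # background click-through rate
    background_entropy = -(ctr * math.log(ctr) + (1 - ctr) * math.log(1 - ctr))
    return log_loss / background_entropy

labels = [1, 0, 0, 0]
preds = [0.8, 0.2, 0.1, 0.3]
print(round(normalized_cross_entropy(labels, preds), 3))  # 0.404
```

A value below 1 means the model beats the trivial predictor that always outputs the background CTR.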

Online Metrics

Since the online metrics should reflect the level of engagement of users when the model is deployed, we can use the conversion rate, which is the ratio of clicks per feed.

1.3. Technical Requirements

The technical requirements will be divided into two main categories: during training and during inference. The technical requirements during training are:

  • Large training set: One of the main requirements during training is to be able to handle the large training dataset. This requires distributed training settings.
  • Data shift: In social networks, it is very common to have a data distribution shift from offline training data to online data. A possible solution to this problem is to retrain the models incrementally multiple times per day.

The technical requirements during inference are:

  • Scalability: To be able to serve customized user feeds for more than 300 million users.
  • Latency: It is important to have low latency, delivering the ranked feed to users in under 250 ms. Since multiple pipelines need to pull data from numerous sources before feeding activities into the ranking models, and all of these steps must complete within 200 ms, the scoring step itself is left with a tight budget of roughly 50 ms.
  • Data freshness: The models must be aware of what the user has already seen, or the feed will show repetitive content and user engagement will drop. The data pipeline therefore needs to update very quickly.

1.4. Technical Challenges

There are four main technical challenges:

  • Scalability: One of the main technical challenges is the scalability of the system, since the number of LinkedIn users to serve is extremely large: around 300 million. Each user sees, on average, 40 activities per visit and visits about 10 times per month, which yields roughly 120 billion observations or samples per month.
  • Storage: Another technical challenge is the huge data size. Assume the click-through rate is 1% per month: the collected data will then contain about 1.2 billion positive data points and roughly 119 billion negatives. Assume further that every data point has 500 features and, for simplicity, that each row of features needs 500 bytes to store. For one month, there will be 120 billion rows of 500 bytes each, or 60 terabytes in total. We would therefore keep only the last six months or year of data in the data lake and archive the rest in cold storage.
  • Personalization: Another technical challenge is personalization, since different users have different interests, so the models must be personalized for each user.
  • Content quality assessment: There is no perfect classifier, and some content will fall into a gray zone where even two humans can have difficulty agreeing on whether it is appropriate to show users. It therefore becomes important to combine human and machine solutions for content quality assessment.
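The scalability and storage figures above can be reproduced with a quick back-of-envelope calculation; all numbers are the assumptions stated in the bullets:

```python
# Back-of-envelope estimate of sample count and storage, using the
# assumptions from the scalability and storage bullets above.
users = 300_000_000          # monthly active users to serve
activities_per_visit = 40
visits_per_month = 10
samples = users * activities_per_visit * visits_per_month
print(f"{samples / 1e9:.0f} billion samples per month")  # 120 billion

ctr = 0.01                   # assumed monthly click-through rate
positives = samples * ctr    # ~1.2 billion clicks
negatives = samples - positives

bytes_per_row = 500          # ~500 features per row, ~1 byte each
total_bytes = samples * bytes_per_row
print(f"{total_bytes / 1e12:.0f} TB per month")          # 60 TB
```

At 60 TB per month, a six-month retention window already implies hundreds of terabytes in the data lake, which motivates the cold-storage archiving mentioned above.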

2. Data Collection

Before training the machine learning classifier, we first need to collect labeled data so that the model can be trained and evaluated. Data collection is a critical step in data science projects, as we need data that is representative of the problem we are trying to solve and similar to what the model is expected to see in production. In this case study, the goal is to collect a lot of data across the different types of posts and content mentioned in subsection 1.1.

The labeled data we want to collect, in our case, is click versus no-click data from users' feeds. There are three main approaches to collecting it:

  • Rank the user's feed chronologically: Data is collected from a feed ranked in chronological order. This approach works, but it biases the data toward the first few items, where users' attention is drawn. It also induces a data sparsity problem: some activities, such as job changes, rarely happen compared to other activities, so they will be underrepresented in the data.
  • Random serving: The second approach is to serve the feed randomly and collect click and no-click data. This approach is not preferred, as it leads to a bad user experience and non-representative data, and it does not help with the data sparsity problem.
  • Use an algorithm to rank the feed: The last approach is to rank the user's feed with an algorithm and then randomly permute the top feeds. This provides some randomness and helps collect data from different activity types.
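The third approach can be sketched in a few lines; the items and scores below are placeholders:

```python
import random

def rank_with_exploration(feed_items, scores, top_k=5, seed=None):
    """Rank items by model score, then randomly permute the top-k.

    Shuffling only the head of the ranking adds enough randomness to
    collect click data across activity types without degrading the
    rest of the feed.
    """
    ranked = [item for _, item in sorted(zip(scores, feed_items), reverse=True)]
    rng = random.Random(seed)
    head = ranked[:top_k]
    rng.shuffle(head)
    return head + ranked[top_k:]

items = ["post", "job_change", "article", "like", "connection", "photo"]
scores = [0.9, 0.1, 0.7, 0.5, 0.3, 0.8]
print(rank_with_exploration(items, scores, top_k=3, seed=42))
```

Only the first `top_k` positions are shuffled, so low-scoring items still never displace the rest of the ranking, which limits the user-experience cost of the exploration.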

3. Data Preprocessing & Feature Engineering

The third step will be preparing the data for the modeling step. This step includes data cleaning, data preprocessing, and feature engineering. Data cleaning will deal with missing data, outliers, and noisy text data. Data preprocessing will include standardization or normalization, handling text data, dealing with imbalanced data, and other preprocessing techniques depending on the data. Feature Engineering will include feature selection and dimensionality reduction. This step mainly depends on the data exploration step as you will gain more understanding and will have better intuition about the data and how to proceed in this step.

The features that can be extracted from the data are:

  • User profile features: These features include job title, user industry, demographic, education, previous experience, etc. These features are categorical features, so they will have to be converted into numerical as most of the models cannot handle categorical features. For higher cardinality, we can use feature embeddings, and for lower cardinality, we can use one hot encoding.
  • Connection strength features: These features represent the similarities between users. We can use embeddings for users and measure the distance between them to calculate the similarity.
  • Age of activity features: These features represent the age of each activity. This can be handled as a continuous feature or can be binned depending on the sensitivity of the click target.
  • Activity features: These features represent the type of activity, such as hashtags, media, posts, and so on. They are also categorical and, as before, must be converted to numerical form using feature embeddings or one-hot encoding, depending on the cardinality.
  • Affinity features: These features represent the similarity between users and activities.
  • Opinion features: These features represent the user's likes and comments on posts, articles, pictures, job changes, and other activities.
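As a minimal sketch of the cardinality rule mentioned in the bullets (the feature names and vocabularies are illustrative):

```python
def one_hot(value, vocabulary):
    """One-hot encode a low-cardinality categorical feature."""
    return [1 if value == v else 0 for v in vocabulary]

industries = ["tech", "finance", "healthcare"]
print(one_hot("finance", industries))  # [0, 1, 0]

# For high-cardinality features (e.g., job title), map each value to an
# integer index that selects a row of a learned embedding matrix; unseen
# values fall back to a reserved "unknown" index.
def to_embedding_index(value, index, unknown=0):
    return index.get(value, unknown)

title_index = {"<unk>": 0, "data scientist": 1, "recruiter": 2}
print(to_embedding_index("recruiter", title_index))  # 2
```

One-hot vectors stay compact only while the vocabulary is small; for thousands of distinct job titles, the embedding-index route keeps the input dimensionality fixed.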

Since the CTR is usually very small (less than 1%), the dataset will be imbalanced. A critical step in the preprocessing phase is therefore to balance the data by resampling to increase the under-represented class.

However, this should be done only to the training set and not to the validation and testing set, as they should represent the data expected to be seen in production.
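A minimal sketch of upsampling the positive class, applied to the training split only; the toy rows below stand in for real feature rows:

```python
import random

def upsample_positives(rows, label_key="clicked", ratio=1.0, seed=7):
    """Upsample the minority (clicked) class in the TRAINING set only.

    `ratio` is the desired positives-to-negatives ratio after resampling.
    Validation and test sets are left untouched so they reflect the
    class balance expected in production.
    """
    rng = random.Random(seed)
    positives = [r for r in rows if r[label_key] == 1]
    negatives = [r for r in rows if r[label_key] == 0]
    target = int(len(negatives) * ratio)
    extra = [rng.choice(positives) for _ in range(max(0, target - len(positives)))]
    return rows + extra

train = [{"clicked": 1}] + [{"clicked": 0}] * 9
balanced = upsample_positives(train, ratio=1.0)
print(sum(r["clicked"] for r in balanced), "positives,",
      sum(1 - r["clicked"] for r in balanced), "negatives")  # 9 positives, 9 negatives
```

Downsampling the negatives is an equally valid alternative when storage is the constraint; either way, only the training split is modified.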

4. Modeling

Now that the data is ready, it is time to select and train the model. As mentioned, this is a classification problem whose target is the click. We can use a logistic regression model for this task. Since the dataset is very large, we can train in a distributed setting, for example with logistic regression in Spark or with the method of multipliers.

We can also use deep learning models in a distributed setting, with fully connected layers and a sigmoid activation function applied to the final layer.
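As a toy illustration of the logistic-regression formulation (the weights and features below are made up, not LinkedIn's actual model):

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def predict_ctr(weights, bias, features):
    """Score one feed activity: P(click) = sigmoid(w . x + b)."""
    z = sum(w * x for w, x in zip(weights, features)) + bias
    return sigmoid(z)

# Toy weights over three features: connection strength, activity age, affinity.
weights = [1.2, -0.8, 0.5]
bias = -1.0
print(round(predict_ctr(weights, bias, [0.9, 0.2, 0.6]), 3))  # 0.555
```

The deep learning variant replaces the single dot product with stacked fully connected layers, but the final sigmoid still maps the score to a click probability, so training and evaluation stay the same.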

For evaluation, we can follow two approaches. The first is the conventional splitting of the data into training and validation sets. Another approach, which avoids biased offline evaluation, is replay evaluation:

  • Assume we have training data up to time point T. The validation data starts from T+1, and we rank it using the trained model.
  • Then the output of the model is compared with the actual click, and the number of matched predicted clicks is calculated.
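The replay procedure above can be sketched as follows, with toy data:

```python
def replay_matched_clicks(ranked_by_model, actually_clicked, top_k=3):
    """Replay evaluation: count how many of the model's top-k ranked
    items were actually clicked in the held-out (T+1 onward) data."""
    return sum(1 for item in ranked_by_model[:top_k] if item in actually_clicked)

# Model's ranking of candidate activities at time T+1 (toy data).
ranking = ["a", "b", "c", "d", "e"]
clicks = {"b", "d"}
print(replay_matched_clicks(ranking, clicks, top_k=3))  # 1 -> only "b" matched
```

Because the comparison uses the clicks users actually made after time T, this avoids the optimistic bias of evaluating on a random split of historical data.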

There are many hyperparameters to optimize; among them are the size of the training data and the frequency of retraining the model. To keep the model updated, we can fine-tune the existing deep learning model on, for example, the most recent six months of training data.

5. High-Level Design

We can summarize the whole process of the feed ranking with this high-level design shown in figure 1.

Let’s see how the flow of the feed ranking process occurs, as shown in the figure below:

  • When the user visits the LinkedIn homepage, requests are sent to the Application server for feeds.
  • The Application server sends feed requests to the Feed Service.
  • Feed Service then gets the latest model from the model store and the right features from the Feature Store.
  • Feature Store: The feature store holds the feature values. During inference, features must be accessible with low latency before scoring.
  • Feed Service receives all the feeds from the ItemStore.
  • Item Store: The item store holds all activities generated by users. It also stores the models for different users: since it is important to maintain a consistent user experience by applying the same feed-ranking method to each user, the item store provides the right model for each user.
  • Feed Service will then provide the model with the features to get predictions. The feed service here represents both the retrieval and ranking service for better visualization.
  • The model will return the feeds ranked by CTR likelihood which is then returned to the application server.

 


Figure 1. LinkedIn feed ranking high-level design.

To scale the feed ranking system, we can put a Load Balancer in front of the Application Servers. This will balance and distribute the load among the several application servers in the system.

 


Figure 2. The scaled LinkedIn feed ranking high-level design.

6. References

  1. Strategies for Keeping the LinkedIn Feed Relevant
  2. Machine Learning Design Interview

Youssef Hosni is Co-Founder at Elfehres, Ph.D. Researcher – Computer vision, and Data Scientist

Sourced from KDnuggets

By Sharon Goldman

More than ever, organizations are putting their confidence – and investment – into the potential of artificial intelligence (AI) and machine learning (ML).

According to the 2022 IBM Global AI Adoption Index, 35% of companies report using AI today in their business, while an additional 42% say they are exploring AI. Meanwhile, a McKinsey survey found that 56% of respondents reported they had adopted AI in at least one function in 2021, up from 50% in 2020.

But can investments in AI deliver true ROI that directly impacts a company’s bottom line?

According to Domino Data Lab’s recent REVelate survey, which surveyed attendees at New York City’s Rev3 conference in May, many respondents seem to think so. Nearly half, in fact, expect double-digit growth as a result of data science. And 4 in 5 respondents (79%) said that data science, ML and AI are critical to the overall future growth of their company, with 36% calling it the single most critical factor.

Implementing AI, of course, is no easy task. Other survey data shows another side of the confidence coin. For example, recent survey data by AI engineering firm CognitiveScale finds that, although execs know that data quality and deployment are critical success factors for successful app development to drive digital transformation, more than 76% aren’t sure how to get there in their target 12–18 month window. In addition, 32% of execs say that it has taken longer than expected to get an AI system into production.

AI must be accountable

ROI from AI is possible, but it must be accurately described and personified according to a business goal, Bob Picciano, CEO of Cognitive Scale, told VentureBeat.

“If the business goal is to get more long-range prediction and increased prediction accuracy with historical data, that’s where AI can come into play,” he said. “But AI has to be accountable to drive business effectiveness – it’s not sufficient to say a ML model was 98% accurate.”

Instead, the ROI could be, for example, that in order to improve call centre effectiveness, AI-driven capabilities ensure that the average call handling time is reduced.

“That kind of ROI is what they talk about in the C-suite,” he explained. “They don’t talk about whether the model is accurate or robust or drifting.”

Shay Sabhikhi, cofounder and COO at Cognitive Scale, added that he’s not surprised by the fact that 76% of respondents reported having trouble scaling their AI efforts. “That’s exactly what we’re hearing from our enterprise clients,” he said. One problem is friction between data science teams and the rest of the organization, he explained, that doesn’t know what to do with the models that they develop.

“Those models may have potentially the best algorithms and precision recall, but sit on the shelf because they literally get thrown over to the development team that then has to scramble, trying to assemble the application together,” he said.

At this point, however, organizations have to be accountable for their investments in AI because AI is no longer a series of science experiments, Picciano pointed out. “We call it going from the lab to life,” he said. “I was at a chief data analytics officer conference and they all said, how do I scale? How do I industrialize AI?”

Is ROI the right metric for AI?

However, not everyone agrees that ROI is even the best way to measure whether AI drives value in the organization. According to Nicola Morini Bianzino, global chief technology officer, EY, thinking of artificial intelligence and the enterprise in terms of “use cases” that are then measured through ROI is the wrong way to go about AI.

“To me, AI is a set of techniques that will be deployed pretty much everywhere across the enterprise – there is not going to be an isolation of a use case with the associated ROI analysis,” he said.

Instead, he explained, organizations simply have to use AI – everywhere. “It’s almost like the cloud, where two or three years ago I had a lot of conversations with clients who asked, ‘What is the ROI? What’s the business case for me to move to the cloud?’ Now, post-pandemic, that conversation doesn’t happen anymore. Everybody just says, ‘I’ve got to do it.’”

Also, Bianzino pointed out, discussing AI and ROI depends on what you mean by “using AI.”

“Let’s say you are trying to apply some self-driving capabilities – that is, computer vision as a branch of AI,” he said. “Is that a business case? No, because you cannot implement self-driving without AI.” The same is true for a company like EY, which ingests massive amounts of data and provides advice to clients – which can’t be done without AI. “It’s something that you cannot isolate away from the process – it’s built into it,” he said.

In addition, AI, by definition, is not productive or efficient on day one. It takes time to get the data, train the models, evolve the models and scale up the models. “It’s not like one day you can say, I’m done with the AI and 100% of the value is right there – no, this is an ongoing capability that gets better in time,” he said. “There is not really an end in terms of value that can be generated.”

In a way, Bianzino said, AI is becoming part of the cost of doing business. “If you are in a business that involves data analysis, you cannot not have AI capabilities,” he explained. “Can you isolate the business case of these models? It is very difficult and I don’t think it’s necessary. To me, it’s almost like it’s a cost of the infrastructure to run your business.”

ROI of AI is hard to measure

Kjell Carlsson, head of data science strategy and evangelism at enterprise MLops provider Domino Data Lab, says that at the end of the day, what organizations want is a measure of the business impact of ROI – how much it contributed to the bottom line. But one problem is that this can be quite disconnected from how much work has gone into developing the model.

“So if you create a model which improves click-through conversion by a percentage point, you’ve just added several million dollars to the bottom line of the organization,” he said. “But you could also have created a good predictive maintenance model which helped give advance warning to a piece of machinery needing maintenance before it happens.” In that case, the dollar-value impact to the organization could be entirely different, “even though one of them might end up being a much harder problem,” he added.

Overall, organizations do need a “balanced scorecard” where they are tracking AI production. “Because if you’re not getting anything into production, then that’s probably a sign that you’ve got an issue,” he said. “On the other hand, if you are getting too much into production, that can also be a sign that there’s an issue.”

For example, the more models data science teams deploy, the more models they’re on the hook for managing and maintaining, he explained. “So [if] you deployed this many models in the last year, so you can’t actually undertake these other high-value ones that are coming your way,” he said.

But another issue in measuring the ROI of AI is that for a lot of data science projects, the outcome isn’t a model that goes into production. “If you want to do a quantitative win-loss analysis of deals in the last year, you might want to do a rigorous statistical investigation of that,” he said. “But there’s no model that would go into production, you’re using the AI for the insights you get along the way.”

Data science activities must be tracked

Still, organizations can’t measure the role of AI if data science activities aren’t tracked. “One of the problems right now is that so few data science activities are really being collected and analysed,” said Carlsson. “If you ask folks, they say they don’t really know how the model is performing, or how many projects they have, or how many code commits your data scientists have made within the last week.”

One reason for that is the very disconnected tools data scientists are required to use. “This is one of the reasons why Git has become all the more popular as a repository, a single source of truth for your data scientist in an organization,” he explained. MLops tools such as Domino Data Lab offer platforms that support these different tools. “The degree to which organizations can create these more centralized platforms … is important,” he said.

AI outcomes are top of mind

Wallaroo CEO and founder Vid Jain spent close to a decade in the high-frequency trading business in Merrill Lynch, where his role, he said, was to deploy ML at scale and do so with a positive ROI.

The challenge was not actually developing the data science, cleansing the data or building the trade repositories, now called data lakes. By far, the biggest challenge was taking those models, operationalizing them and delivering the business value, he said.

“Delivering the ROI turns out to be very hard – 90% of these AI initiatives don’t generate their ROI, or they don’t generate enough ROI to be worth the investment,” he said. “But this is top of mind for everybody. And the answer is not one thing.”

A fundamental issue is that many assume that operationalizing ML is not much different than operationalizing a standard kind of application, he explained, adding that there is a big difference, because AI is not static.

“It’s almost like tending a farm, because the data is living, the data changes and you’re not done,” he said. “It’s not like you build a recommendation algorithm and then people’s behaviour of how they buy is frozen in time. People change how they buy. All of a sudden, your competitor has a promotion. They stop buying from you. They go to the competitor. You have to constantly tend to it.”

Ultimately, every organization needs to decide how they will align their culture to the end goal around implementing AI. “Then you really have to empower the people to drive this transformation, and then make the people that are critical to your existing lines of business feel like they’re going to get some value out of the AI,” he said.

Most companies are still early in that journey, he added. “I don’t think most companies are there yet, but I’ve certainly seen over the last six to nine months that there’s been a shift towards getting serious about the business outcome and the business value.”

ROI of AI remains elusive

But the question of how to measure the ROI of AI remains elusive for many organizations. “For some there are some basic things, like they can’t even get their models into production, or they can but they’re flying blind, or they are successful but now they want to scale,” Jain said. “But as far as the ROI, there is often no P&L associated with machine learning.”

Often, AI initiatives are part of a Centre of Excellence and the ROI is grabbed by the business units, he explained, while in other cases it’s simply difficult to measure.

“The problem is, is the AI part of the business? Or is it a utility? If you’re a digital native, AI might be part of the fuel the business runs on,” he said. “But in a large organization that has legacy businesses or is pivoting, how to measure ROI is a fundamental question they have to wrestle with.”

By Sharon Goldman

Sourced from VentureBeat

Data science is a new interdisciplinary field of research that focuses on extracting value from data, integrating knowledge and methods from computer science, mathematics and statistics, and an application domain. Machine learning is the field created at the intersection of computer science and statistics, and it has many applications in data science when the application domain is taken into consideration.

From a historical perspective, machine learning was considered, for the past 50 years or so, as part of artificial intelligence. It was taught mainly in computer science departments to scientists and engineers and the focus was placed, accordingly, on the mathematical and algorithmic aspects of machine learning, regardless of the application domain. Thus, although machine learning deals also with statistics, which focuses on data and does consider the application domain, up until recently, most machine learning activities took place in the context of computer science, where it began, and which focuses traditionally on algorithms.

Two processes, however, have taken place in parallel to the accelerated growth of data science in the last decade. First, machine learning, as a sub-field of data science, flourished, and its implementation and use in a variety of disciplines began. As a result, researchers realized that the application domain cannot be neglected and should be considered in any data science problem-solving situation. For example, it is essential to know the meaning of the data in the context of the application domain to prepare the data for the training phase and to evaluate the algorithm’s performance based on the meaning of the results in the real world. Second, a variety of populations began taking machine learning courses: people for whom, as experts in their disciplines, it is inherent and essential to consider the application domain in data science problem-solving processes.

Teaching machine learning to such a vast population, while neglecting the application domain as it is taught traditionally in computer science departments, is misleading. Such a teaching approach guides learners to ignore the application domain even when it is relevant for the modelling phase of data science, in which machine learning is largely used. In other words, when students learn machine learning without considering the application domain, they may get the impression that machine learning should be applied this way and become accustomed to ignoring the application domain. This habit of mind may, in turn, influence their future professional decision-making processes.

For example, consider a researcher in the discipline of social work who took a machine learning course but was not educated to consider the application domain in the interpretation of the data analysis. The researcher is now asked to recommend an intervention program. Since the researcher was not educated to consider the application domain, he or she may ignore crucial factors in this examination and rely only on the recommendation of the machine learning algorithm.

Other examples are education and transportation, fields that everyone feels they understand. As a result of a machine learning education that does not consider the application domain, non-experts in these fields may assume that they have enough knowledge in these fields, and may not understand the crucial role that professional knowledge in these fields plays in decision-making processes that are based on the examination of the output of machine learning algorithms. This phenomenon is further highlighted when medical doctors or food engineers, for example, are not trained or educated in machine learning courses to criticize the results of machine learning algorithms based on their professionalism in medicine and food engineering, respectively.

We therefore propose to stop teaching machine learning courses to populations whose core discipline is neither computer science nor mathematics and statistics. Instead, these populations should learn machine learning only in the context of data science, which repeatedly highlights the relevance of the application domain in each stage of the data science lifecycle and, specifically, in the modelling phase in which machine learning plays an important role.

If our suggestion, to offer machine learning courses in a variety of disciplines only in the context of data science, is accepted, not only will the interdisciplinarity of data science be highlighted, but the realization that the application domain cannot be neglected in data science problem-solving processes will also be further illuminated.

Don’t teach machine learning! Teach data science!

Orit Hazzan is a professor in the Technion’s Department of Education in Science and Technology; her research focuses on computer science, software engineering, and data science education. Koby Mike is a Ph.D. student at the Technion’s Department of Education in Science and Technology; his research focuses on data science education.

Sourced from Communications of the ACM

By Paul Towler

It can be tricky to follow the latest “tools of the trade” regarding online marketing strategies. The importance of local SEO, the rise of machine learning and customised content all represent trending topics.

However, what about the so-called “metaverse”? Might this realm represent the next leap forward with effective advertising? Before we look at five unique opportunities within this field, it is a good idea to take a closer look at exactly what the metaverse is.

What Exactly is the Metaverse?


Perhaps the simplest way to define the metaverse involves the concept of a three-dimensional social media community. This type of virtual reality allows members to interact with one another. However, this is also a much broader concept.

The metaverse essentially represents the numerous ways in which users communicate with one another across the digital community. This can include online role-playing games, smartphone applications and even the ability to buy and sell goods online.

We can now see that there is more than one way to define the metaverse. Perhaps this concept essentially involves more efficient ways of connecting with users. So, it makes perfect sense that marketers have become interested in what it can offer. Let’s now look at some potential opportunities for success.

  1. A Much Broader Reach


    As this article rightfully observes, the metaverse is more of a concept than a reality at the moment. However, marketers can still take advantage of its potential. Similar to social media channels, campaigns within this digital “ether” are thought to be capable of reaching a much wider audience; particularly millennials. As this world becomes ever more interconnected, it should be possible to employ a single advertising campaign to reach a wide range of consumers. 

  2. The Immersive Nature of the Metaverse


    One of the pitfalls that marketing experts are likely to experience from time to time involves keeping the attention of a fickle audience. After all, the online community is laden with advertisements. This has caused many consumers to simply ignore these campaigns entirely; even if they happen to be offering truly unique products or services.

    Thankfully, things may soon be able to change thanks to the presence of the metaverse. We need to remember that this environment will provide a much more immersive means to communicate with others. Therefore, it may be possible to create advertising campaigns that allow users to interact with certain elements. Here are some interesting possibilities:

  • Changing the colour of an object.
  • Obtaining 360-degree views of what is being offered.
  • Clicking on specific portions of an advertisement to be taken to different pages of an embedded website.

The bottom line is that keeping the attention of potential clients is one of the best ways to ensure a conversion. 

  3. A New Type of Social Media Influencer


    Avatars are also predicted to play an important role within the metaverse. This is due to the inherently social nature of such a digital society. So, individuals will likely be given the opportunity to create their own avatars when communicating with others. Why should this be any different for businesses?

    We once again return to the decidedly personal side of marketing. Individuals do not wish to be thought of as consumers, but rather as people who share common goals and interests. Developing a branded avatar will provide businesses with a much more targeted (and even organic) means to promote what it is that they have to offer.

    From promoting virtual fashion exhibitions and advertising digital dancehalls to providing exclusive deals to other virtual “friends”, the possibilities are nearly limitless. 

  4. Taking the Notion of Online Sales to the Next Level


    In the past, digital sales were somewhat limited by the technology at their disposal. While it was possible to examine products in minute detail and to select certain options during the buying process, this was hardly the same as visiting a physical retail outlet.

    Once again, the metaverse is expected to rise to the occasion. As some facets of this world are set to be rooted within the world of augmented reality, the ability to truly interact with what is being offered could present some amazing marketing opportunities.

    Might it soon be possible to take a car for a digital “test drive” before committing to a purchase? Could users eventually be able to try on an item of clothing or virtually tour a home before it is even built? These are some of the ways in which the metaverse will change the entire notion of marketing.

  5. What About “Meta Products”?


    This final concept is slightly strange and yet, it is also slated to have an impact on digital marketing. There could very well be options to create lines of virtual products to augment existing revenue streams. In fact, these strategies are already present. Examples of virtual goods include:

  • Avatars for a character within an MMORPG.
  • Virtual currencies.
  • Paid access to online events.
  • E-books and online distance learning courses.


When applied to the world of marketing, the value of meta products (such as free samples or discounts for the first hundred virtual shoppers) becomes very clear.

These are five opportunities for marketers who wish to take full advantage of what the metaverse is expected to become. Although the entire concept may appear a bit odd at first glance, many felt the same about social media during the early 2000s. Once again, it pays to think a few steps ahead of the competition.

By Paul Towler

Paul Towler is the technical operations director at SmartOffice, a software automation provider who helps companies with their document management systems.

Could the tech giants take control of the AI narrative and reduce choices for enterprises? Experts weighed the pros and cons in a recent online conference.

Artificial intelligence and machine learning require huge amounts of processing capacity and data storage, making the cloud the preferred option. That raises the specter of a few cloud giants dominating AI applications and platforms. Could the tech giants take control of the AI narrative and reduce choices for enterprises?

Not necessarily, but with some caveats, AI experts emphasize. But the large cloud providers are definitely in a position to control the AI narrative from several perspectives.

That’s part of the consensus from a recent webcast hosted by the New York University Center for the Future of Management and the LMU Institute for Strategy, Technology and Organization, joined by Daron Acemoglu, professor at MIT; Jacques Bughin, professor at the Solvay School of Economics and Management; and Raffaella Sadun, professor at Harvard Business School.

There’s more to AI than cloud. The complexity and diversity of AI applications go well beyond the cloud environments where they are run — and therefore reduce the dominance of a few cloud giants.

Certainly, “AI will require more capacity in storage, of the information flow,” says Bughin. At the same time, “cloud is only one part of the total pie of the platform. It’s part of infrastructure, but the platform layer is what you develop in house and through a third party. This integration is going to be hybrid, even more important than the cloud itself. Let’s be very clear, it’s not about operation, it’s a lot of algorithms, it’s a lot of different data, that integration piece, that will require system integration, architecture and design. That means that different types of firms will be involved in that work.”

What Bughin worries about more is the innovation potential from AI startups that may be squashed by larger players gobbling up smaller companies and startups through mergers and acquisitions. “Companies like the big internet or AI guys are going and buying a lot of very small and very clever AI firms.”

At the same time, Sadun points out that smaller companies may be in a better position to leverage AI innovations, but need help with training and education to prepare them. “This issue of who benefits from AI is really important,” she says. “On the one hand, we might think the smaller firms may be able to use these technologies more effectively, because they are more nimble, more agile. Companies that are already digital can exploit and scale AI.”

Where the large cloud providers may also make their dominance felt is in the monopolization of the data that feeds AI systems, says Acemoglu. Cloud architecture itself can be based on price-sensitive and competitive cloud services, he explains. “But the cloud architecture will not enable you to exploit data. The areas where I worry about the future of AI technologies are those that enable firms to monopolize data. That’s where firms have an oversized effect on the future direction of technology. That means a few people in a boardroom are going to determine where a technology’s going to go. We want more people-focused and people-centric AI. That’s not going to be possible if a few firms that have a different business model dominate the future of technology.”

The value of an AI-driven enterprise “does not reside in the cloud that enables it,” Bughin believes. “I think there’s enough competition for the price point not to destroy the value. The value will come from the fact that you have integrated these technologies where you work, and the way your company works, in your own back end. The back end is not going to be the battlefield. The value is from generating productivity and revenue, at a rate faster than what we’ve seen in traditional digital transformations.”

And, for the first time, we see the terms traditional and digital transformation used together in the same sentence. As these thought leaders relate, such transformations are moving to the next phase, enabling autonomous, software-driven operations and innovation through AI. It’s a question of whether large tech vendors control the momentum, or if it remains a market and practice with a diversity of choices. Stay tuned.

Feature Image Credit: Joe McKendrick

Sourced from ZDNet

By Anil Gupta

Artificial intelligence and machine learning are among the top marketing buzzwords we come across in the field of digital marketing. These technologies have already become an integral part of digital marketing and are being leveraged to make campaigns more personable and efficient.

For instance, artificial intelligence can make personalization easy and quick by creating accurate buyer personas. These personas are auto-generated to deliver a holistic audience segmentation, thereby improving the effectiveness of the campaigns. In addition, Netflix, Google, Uber, Spotify, Pinterest, and other apps use machine learning to personalize individual accounts and make relevant recommendations to their users.

The ever-improving algorithms and the exponential growth of data are encouraging business leaders and marketers to use AI, in the form of machine learning, natural language processing (NLP), deep learning, and other technologies. These technologies are helping them improve customer experience and conversions.

A Gartner survey shows that 37% of organizations are applying AI in some form or the other to boost their digital performance.

This post highlights how AI and ML are proving to be game-changers in the digital marketing realm.

1. Offer a Better Understanding of the Audience

Great content starts with knowing the audience well. When a business knows its target audience, the connection feels more natural and relevant. That genuine connection goes a long way in building lasting relationships with customers.

In recent years, AI and ML have opened up a whole new world of possibilities for understanding audience behavior. AI tools and data-driven insights are helping businesses understand who they are reaching, what the customers want and need, when to communicate, and where to reach them.

Artificial intelligence helps marketers instantly define buyer personas. Platforms like Socialbakers auto-generate these personas to deliver more holistic audience segmentation in the form of actionable insights. These insights help content marketers share inspiring stories that convert.

Keeping your audience at the centre of your online strategies is critical to business success. AI can help by offering unique audience insights, enabling businesses to deliver an integrated brand experience through relevant content. It also helps in selecting the most trustworthy and effective influencers for the brand.
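As a rough sketch of what such segmentation tools do under the hood, a clustering algorithm can group customers into personas from behavioural data alone. The features, numbers, and segment names below are invented purely for illustration:

```python
from sklearn.cluster import KMeans
import numpy as np

# Hypothetical behavioural features per customer:
# [monthly visits, avg. session minutes, purchases last quarter]
customers = np.array([
    [2, 1.5, 0], [3, 2.0, 0], [1, 1.0, 0],       # casual browsers
    [20, 8.0, 4], [25, 9.5, 5], [18, 7.0, 3],    # engaged buyers
    [10, 3.0, 1], [12, 4.0, 1],                  # middle segment
])

# Group the customers into three personas based on behaviour alone
model = KMeans(n_clusters=3, n_init=10, random_state=0).fit(customers)
print(model.labels_)  # each customer's segment id
```

In practice, the number of segments and the features fed into the model are the real design decisions; commercial persona tools layer demographics, channel preferences, and content affinities on top of behavioural signals like these.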

2. Help with Lead Management

Big data, predictive analytics, and machine learning are being increasingly used in business intelligence these days. Machine learning, with its ability to bring out valuable hidden insights from large data sets, can create tangible value for businesses.

Leads are the driving force for businesses. They are the ones who will soon contribute to the organizational revenue. Hence, business leaders spend a significant amount of time in lead management. ML can be leveraged to improve and scale a firm’s approach to lead management, thereby boosting the bottom line. It helps firms generate better leads, qualify and nurture them, and ultimately monetize them effectively.

For instance, ML can help you create an ideal customer profile (ICP) to reach the best customers. ICP takes a structured look at the demographics and psychographics of an individual and determines their purchase intent and the content that matters to them. Thus, ICP can be used for lead scoring, allowing marketers to prioritize targeted accounts.
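A very reduced sketch of ICP-based lead scoring: weight the attributes an ideal customer has, then rank leads by how many of those attributes they match. The attribute names and weights here are hypothetical, not a real platform's schema:

```python
# Hypothetical ideal customer profile (ICP): weighted attributes a
# best-fit lead should have. Weights are illustrative only.
ICP_WEIGHTS = {
    "industry_match": 30,        # lead is in a target industry
    "company_size_fit": 25,      # headcount in the sweet spot
    "visited_pricing": 20,       # strong purchase-intent signal
    "opened_emails": 15,
    "downloaded_whitepaper": 10,
}

def score_lead(lead: dict) -> int:
    """Sum the weights of every ICP attribute the lead satisfies."""
    return sum(w for attr, w in ICP_WEIGHTS.items() if lead.get(attr))

def prioritize(leads: list) -> list:
    """Order leads so sales can work the best-fit accounts first."""
    return sorted(leads, key=score_lead, reverse=True)

leads = [
    {"name": "Acme", "industry_match": True, "visited_pricing": True},
    {"name": "Globex", "opened_emails": True},
]
print([lead["name"] for lead in prioritize(leads)])  # Acme first
```

Real ML-based lead scoring learns these weights from historical won/lost deals rather than fixing them by hand, but the output is the same: a ranked queue for the sales team.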

ML can also help firms generate more qualified leads from the traffic already coming to the site. For example, check out how Drift, a revenue acceleration platform, uses conversational AI to recognize quality from noise, learn from the conversations, and automatically qualify or disqualify website visitors. These qualifiers help the sales team focus on leads that are ready for conversions.

3. Curate and Create Better Content

AI is changing the game for content marketers. The technology is being used to automatically generate content for simple stories like sports news or stock market updates. AI also allows social channels to customize users’ news feeds.

But one content field where AI is increasingly applied is content curation. AI algorithms make it easier to collect target audience data to create relevant content at each stage of the marketing funnel.

For instance, the algorithms collect data on what the audience prefers to read, the questions they want answers to, or any specific concerns. Using this data, content marketers can curate and create relevant content that boosts customer experience and ultimately leads to conversions.

The North Face uses IBM Watson, an AI-powered technology, to recreate the shopping experience. The tool uses cognitive computing to bring the online and in-store experiences closer together.

Besides, machine learning feeds content strategies by discovering fresh research-based content ideas, identifying the top-performing topic clusters, and surfacing the most relevant keywords in a specific niche.

For instance, Google Analytics and SEMrush operate on machine-learning algorithms that are useful in keyword research and discovery, and content distribution. In addition, these tools can discover industry trends and show you ways to rank higher in SERP.

AI and ML-enabled tools improve the overall reception and performance of online content. In addition, the tools allow marketers to offer relevant and personalized digital experiences that positively influence engagement.

4. Help with Competitive Search Engine Ranking

Search engines are already using AI-enabled algorithms to deliver the most relevant SERP results. These algorithms rely on AI to understand the context of the content and spot irrelevant keywords. No wonder SEOs are constantly striving to understand these algorithms and coming up with strategies to create contextual, conceptual, and accurate content.

The placement of your business in the SERPs can make or break your online reputation and performance. AI technologies make it easier to create compelling content that answers the target audience’s queries, keywords, and phrases.

SEO isn’t a day’s job. It’s challenging, and the results of one’s efforts can only be seen after months. Fortunately, AI-based SEO tools help alleviate this stress. Tools like Moz, WooRank, BrightEdge, and MarketMuse rely heavily on AI to offer SEO solutions like:

  • Keyword research
  • Search terms to make the content more relevant
  • Link-building opportunities
  • Trending topics
  • Optimum content length
  • User intent and more.

Tools like Alli AI can instantly optimize your website regardless of the CMS and your web development expertise. The platform performs a site-wide content and SEO audit, automatically optimizes the content, and resolves duplicate content issues. All this makes it easier for content creators to avoid poor-performing content and boost their online ranking.

5. Improve Page Speed

Google has put a concrete value on fast user experience by including page speed as one of its ranking signals. That’s why boosting page speed is one of the top priorities for all businesses, especially ecommerce firms. As a result, Webmasters take all sorts of measures to improve page speed.

For instance, WordPress site owners may speed up WordPress by optimizing background processes, keeping the WP site updated, using a content delivery network (CDN), or using faster plugins. Of course, they also use various tools like PageSpeed Insights, load time testers, and CMS plugins for the purpose. But now, there’s another ML-powered solution available for boosting page speed – the Page Forecasting Model.

This model uses machine learning to predict user behaviour, forecasting in real time the next page a visitor will click on. This allows Webmasters to preload that page in the background, thus improving the overall experience.

The algorithm is trained with historical data from Google Analytics.

For instance, user patterns like going from the home page to a category page, or from a product page to the shopping cart, are recognized, understood, and used to update the algorithm. If a user behaves similarly, the algorithm automatically prepares the next page.

However, the prediction accuracy is dependent on the amount of data available to train the algorithm and the website structure. So, the models will vary according to these factors. For instance, if yours is an ecommerce website that combines industry news with product pages, it’s better to use two or more models that can predict the behaviour per section.
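In spirit, this kind of forecasting can be approximated with a simple first-order transition model: count which page historically follows which, then preload the most likely candidate. A minimal sketch with invented session data (real models are trained on Google Analytics exports and far richer features):

```python
from collections import Counter, defaultdict

def train(sessions):
    """Count page-to-page transitions in historical visit paths."""
    transitions = defaultdict(Counter)
    for path in sessions:
        for current_page, next_page in zip(path, path[1:]):
            transitions[current_page][next_page] += 1
    return transitions

def predict_next(transitions, current_page):
    """Return the page most often visited after current_page."""
    counts = transitions.get(current_page)
    if not counts:
        return None  # no history for this page
    return counts.most_common(1)[0][0]

# Hypothetical historical sessions (ordered page visits)
history = [
    ["home", "category", "product", "cart"],
    ["home", "category", "product"],
    ["home", "search", "product", "cart"],
]
model = train(history)
print(predict_next(model, "home"))     # "category" (2 of 3 sessions)
print(predict_next(model, "product"))  # "cart"
```

This also makes the point about website structure concrete: a site that mixes very different sections would train one such model per section, since the transition patterns differ.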

6. Automate Website Analytics Process

Web analytics isn’t new. Businesses have been assessing user behaviour and tracking key performance metrics since the mid-’90s. But thanks to AI and machine learning, web analytics tools now have robust capabilities that allow businesses to automate the process. These tools can offer auto-generated reports and on-demand insights that feed marketing strategies.

Within a single visit to a webpage, each user generates hundreds of data points, such as time spent on the page, browser details, location, and more. It is practically impossible to analyse all this data manually. AI and ML make such analysis faster and more accurate by speeding up data processing.

AI-based tools can help you track each visitor’s online behaviour, understand user journeys, and how customers move through the marketing funnel. They also point out issues, if any.

Let’s say you have a blog post that gets a lot of traffic, but visitors just read the post and leave without taking action like subscribing to your newsletter or sharing your post on social media. AI-based tools can flag such issues, allowing you to take the necessary corrective action like adding internal links or improving your CTA.
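A basic version of this kind of flagging can be scripted directly against exported analytics data. The thresholds, field names, and URLs below are invented for illustration:

```python
def flag_leaky_pages(pages, min_visits=500, max_action_rate=0.02):
    """Flag pages that attract traffic but rarely convert visitors
    into an action (subscribing, sharing, clicking a CTA)."""
    flagged = []
    for page in pages:
        rate = page["actions"] / page["visits"]
        if page["visits"] >= min_visits and rate < max_action_rate:
            flagged.append((page["url"], round(rate, 4)))
    return flagged

# Hypothetical analytics export: visits and on-page actions per URL
stats = [
    {"url": "/blog/popular-post", "visits": 12000, "actions": 60},
    {"url": "/blog/niche-post", "visits": 300, "actions": 30},
    {"url": "/pricing", "visits": 4000, "actions": 400},
]
print(flag_leaky_pages(stats))  # only /blog/popular-post is flagged
```

AI-based analytics tools go further, learning what a "normal" action rate looks like per page type instead of relying on fixed thresholds like these.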

Google Analytics (insights section), Adobe Analytics, and Kissmetrics are among the top web analytics tools that help firms see patterns in customer behaviour and predict future trends.

7. Improve Site Navigation

Site navigation is another critical area of digital performance where AI and ML can help. Though it may seem negligible, the importance of organized, easy-to-follow navigation cannot be ignored. Well-planned navigation improves visit duration, reduces the bounce rate, and boosts user experience. It also enhances the overall aesthetic appeal of the website design.

AI can help Webmasters create a user-friendly website structure that’s easy to navigate. AI-powered chatbots can guide users through the pages and help them find what they are looking for within the first few clicks. This significantly improves the user experience and sends good signals to search engines, indicating that your content is useful and relevant.

Thus, Google and other search engines are more likely to rank your page higher than other websites offering similar content.

8. Design Better Websites

AI applications can improve the usability and experience of a website by enhancing the site’s appearance, strengthening its search abilities, managing inventory better, and improving interaction with website visitors. No wonder a growing number of designers and developers are moving towards AI-based design practices.

AI is slowly becoming an indispensable part of modern web design and development. Take the field of artificial design intelligence (ADI) systems, for instance. ADI has triggered a sudden shift in the way web design is done. It allows designers to combine applications into the website for better user experience and functionality.

Check out The Grid, a website platform that automatically adapts its design to highlight the content. The platform uses ML, constraint-based design, and flow-based programming to dynamically adapt the website design to the content.

Today, we have several entrants in this space that are taking AI in web design to a whole new level. Brands like Adobe, Firedrop, Bookmark, Wix, Tailor Brands, and many others are leading the segment and leveraging the capabilities of AI in web design. In addition, most of these ADI platforms can learn and offer suggestions for optimizing the website for better user experience and SEO performance.

The Way Forward

Artificial intelligence and machine learning are proving to be awesome technologies when it comes to improving a firm’s digital performance. However, it is essential to remember that these ML models are only as good as the data that’s used to train them. Therefore, it’s critical to ensure that your marketing team has access to high-quality and accurate data.

So, before applying these technologies to your digital efforts, there are specific steps that you need to take.

  • Set up tags to track and capture on-site user behaviour.
  • House all the data from different sources in one central place like Google BigQuery, a Big Data analytics platform.
  • Invest in data deduplication to eliminate duplicate copies of repeating data from multiple sources.
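The deduplication step, for example, can be sketched in a few lines of Python. The field names and records are hypothetical; in practice this is often done in SQL inside the warehouse itself:

```python
def deduplicate(records, key_fields=("email",)):
    """Keep the first record seen for each unique key, dropping
    duplicate copies merged in from multiple data sources."""
    seen = set()
    unique = []
    for record in records:
        # Normalize the key so trivial variants count as duplicates
        key = tuple(record.get(f, "").strip().lower() for f in key_fields)
        if key not in seen:
            seen.add(key)
            unique.append(record)
    return unique

# Hypothetical rows from two sources describing the same user
rows = [
    {"email": "ana@example.com", "source": "crm"},
    {"email": "Ana@Example.com ", "source": "web"},  # duplicate
    {"email": "bo@example.com", "source": "web"},
]
print(len(deduplicate(rows)))  # 2
```

Which record "wins" when duplicates collide (first seen, most recent, most complete) is a policy decision worth settling before any model touches the data.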

Once your data is in place, you will be in a great position to start deploying AI and ML for boosting your digital performance. In addition, the information shared above will prove to be useful as you start building machine learning solutions for improving your business’s online presence.

By Anil Gupta

Anil is the CEO & Co-Founder of Multidots, one of the top WordPress development agencies on the planet. He is a technopreneur with over 13 years of experience coding, thinking, and leading the business with mind and people with heart. He and his team are seasoned in delivering secure and feature-rich WordPress services for businesses big and small.

Sourced from readwrite