China announced in 2017 its ambition to become the world leader in artificial intelligence (AI) by 2030. While the US still leads in absolute terms, China appears to be making more rapid progress than either the US or the EU, and central and local government spending on AI in China is estimated to be in the tens of billions of dollars.

The move has led – at least in the West – to warnings of a global AI arms race and concerns about the growing reach of China’s authoritarian surveillance state. But treating China as a “villain” in this way is both overly simplistic and potentially costly. While there are undoubtedly aspects of the Chinese government’s approach to AI that are highly concerning and rightly should be condemned, it’s important that this does not cloud all analysis of China’s AI innovation.

The world needs to engage seriously with China’s AI development and take a closer look at what’s really going on. The story is complex and it’s important to highlight where China is making promising advances in useful AI applications and to challenge common misconceptions, as well as to caution against problematic uses.

Nesta has explored the broad spectrum of AI activity in China – the good, the bad and the unexpected.

The good

China’s approach to AI development and implementation is fast-paced and pragmatic, oriented towards finding applications which can help solve real-world problems. Rapid progress is being made in the field of healthcare, for example, as China grapples with providing easy access to affordable and high-quality services for its ageing population.

Applications include “AI doctor” chatbots, which help to connect communities in remote areas with experienced consultants via telemedicine; machine learning to speed up pharmaceutical research; and the use of deep learning for medical image processing, which can help with the early detection of cancer and other diseases.

Since the outbreak of COVID-19, medical AI applications have surged as Chinese researchers and tech companies have rushed to try and combat the virus by speeding up screening, diagnosis and new drug development. AI tools used in Wuhan, China, to tackle COVID-19 – by helping accelerate CT scan diagnosis – are now being used in Italy and have been also offered to the NHS in the UK.

The bad

But there are also elements of China’s use of AI which are seriously concerning. Positive advances in practical AI applications which are benefiting citizens and society don’t detract from the fact that China’s authoritarian government is also using AI and citizens’ data in ways that violate privacy and civil liberties.

Most disturbingly, reports and leaked documents have revealed the government’s use of facial recognition technologies to enable the surveillance and detention of Muslim ethnic minorities in China’s Xinjiang province.

The emergence of opaque social governance systems which lack accountability mechanisms is also a cause for concern.

In Shanghai’s “smart court” system, for example, AI-generated assessments are used to help with sentencing decisions. But it is difficult for defendants to assess the tool’s potential biases, the quality of the data and the soundness of the algorithm, making it hard for them to challenge the decisions made.

China’s experience reminds us of the need for transparency and accountability when it comes to AI in public services. Systems must be designed and implemented in ways that are inclusive and protect citizens’ digital rights.

The unexpected

Commentators have often interpreted the State Council’s 2017 Artificial Intelligence Development Plan as an indication that China’s AI mobilisation is a top-down, centrally planned strategy.

But a closer look at the dynamics of China’s AI development reveals the importance of local government in implementing innovation policy. Municipal and provincial governments across China are establishing cross-sector partnerships with research institutions and tech companies to create local AI innovation ecosystems and drive rapid research and development.

Beyond the thriving major cities of Beijing, Shanghai and Shenzhen, efforts to develop successful innovation hubs are also underway in other regions. A promising example is the city of Hangzhou, in Zhejiang Province, which has established an “AI Town”, clustering together the tech company Alibaba, Zhejiang University and local businesses to work collaboratively on AI development. China’s local ecosystem approach could offer interesting insights to policymakers in the UK aiming to boost research and innovation outside the capital and tackle longstanding regional economic imbalances.

China’s accelerating AI innovation deserves the world’s full attention, but it is unhelpful to reduce all the many developments into a simplistic narrative about China as a threat or a villain. Observers outside China need to engage seriously with the debate and make more of an effort to understand – and learn from – the nuances of what’s really happening.

Sourced from The Conversation

By Bernard Marr

We may not be living on Mars or traveling to work using jet packs, but there’s no doubt the coming decade will bring many exciting technological advances. In this article, I want to outline the 25 key technology trends that I believe will shape the 2020s.

1. Artificial intelligence (AI) and machine learning. The increasing ability of machines to learn and act intelligently will absolutely transform our world. It is also the driving force behind many of the other trends on this list.

2. The Internet of Things (IoT). This refers to the ever-growing number of “smart” devices and objects that are connected to the internet. Such devices are constantly gathering and transmitting data, further fueling the growth in Big Data and AI.

3. Wearables and augmented humans. What started with fitness trackers has now exploded into a whole industry of wearable technology designed to improve human performance and help us live healthier, safer, more efficient lives. In the future, we may even see humans merge with technology to create “augmented humans” or “transhumans.”

4. Big Data and augmented analytics. Big Data refers to the exponential growth in the amount of data being created in our world. Thanks to augmented analytics (highly advanced data analytics, often fueled by AI techniques), we can now make sense of and work with enormously complex and varied streams of data.

5. Intelligent spaces and smart places. Closely linked to the IoT, this trend is seeing physical spaces – like homes, offices, and even whole cities – becoming increasingly connected and smart.

6. Blockchains and distributed ledgers. This super-secure method of storing, authenticating, and protecting data could revolutionize many aspects of business – particularly when it comes to facilitating trusted transactions.

7. Cloud and edge computing. Cloud computing – where data is stored on other computers and accessed via the internet – has helped to open up data and analytics to the masses. Edge computing – where data is processed on smart devices (like phones) – will take this to the next level.

8. Digitally extended realities. Encompassing virtual reality, augmented reality, and mixed reality, this trend highlights the move towards creating more immersive digital experiences.

9. Digital twins. A digital twin is a digital copy of an actual physical object, product, process, or ecosystem. This innovative technology allows us to try out alterations and adjustments that would be too expensive or risky to try out on the real physical object.

10. Natural language processing. This technology, which allows machines to understand human language, has dramatically changed how humans interact with machines, in particular giving rise to…

11. Voice interfaces and chatbots. Alexa, Siri, chatbots – many of us are now quite used to communicating with machines by simply speaking or typing our request. In the future, more and more businesses will choose to interact with their customers via voice interfaces and chatbots.

12. Computer vision and facial recognition. Machines can talk, so why shouldn’t they “see” as well? This technology allows machines to visually interpret the world around them, with facial recognition being a prime example. Although we will no doubt see greater regulatory control over the use of facial recognition, this technology isn’t going anywhere.

13. Robots and cobots. Today’s robots are more intelligent than ever, learning to respond to their environment and perform tasks without human intervention. In certain industries, the future of work is likely to involve humans working seamlessly with robot colleagues – hence the term “cobot,” or “collaborative robot.”

14. Autonomous vehicles. The 2020s will be the decade in which autonomous vehicles of all kinds – cars, taxis, trucks, and even ships – become truly autonomous and commercially viable.

15. 5G. The fifth generation of cellular network technology will give us faster, smarter, more stable wireless networking, thereby driving advances in many other trends (e.g., more connected devices and richer streams of data).

16. Genomics and gene editing. Advances in computing and analytics have driven incredible leaps in our understanding of the human genome. Now, we’re progressing to altering the genetic structure of living organisms (for example, “correcting” DNA mutations that can lead to cancer).

17. Machine co-creativity and augmented design. Thanks to AI, machines can do many things – including creating artwork and designs. As a result, we can expect creative and design processes to shift towards greater collaboration with machines.

18. Digital platforms. Facebook, Uber, and Airbnb are all household-name examples of digital platforms – networks that facilitate connections and exchanges between people. This trend is turning established business models on their head, leading many traditional businesses to transition to or incorporate a platform-based model.

19. Drones and unmanned aerial vehicles. These aircraft, which are piloted either remotely or autonomously, have changed the face of military operations. But the impact doesn’t stop there – search and rescue missions, firefighting, law enforcement, and transportation will all be transformed by drone technology. Get ready for passenger drones (drone taxis), too!

20. Cybersecurity and resilience. As businesses face unprecedented new threats, the ability to avoid and mitigate cybersecurity threats will be critical to success over the next decade.

21. Quantum computing. Quantum computers – unimaginably fast computers capable of solving seemingly unsolvable problems – will make our current state-of-the-art technology look like something out of the Stone Age. As yet, work in quantum computing is largely restricted to labs, but we could see the first commercially available quantum computer this decade.

22. Robotic process automation. This technology is used to automate structured and repetitive business processes, freeing up human workers to concentrate on more complex, value-adding work. This is part of a wider shift towards automation that will impact every industry.

23. Mass personalization and micro-moments. Mass-personalization is, as you might expect, the ability to offer highly personalized products or services on a mass scale. Meanwhile, the term “micro-moments” essentially means responding to customer needs at the exact right moment. Both are made possible by technologies like AI, Big Data, and analytics.

24. 3D and 4D printing and additive manufacturing. Although this may seem low-tech compared to some of the other trends, 3D and 4D printing will have very wide applications – and will be particularly transformative when combined with trends like mass-personalization.

25. Nanotechnology and materials science. Our increasing ability to understand materials and control matter on a tiny scale is giving rise to exciting new materials and products, such as bendable displays.

Read more about these 25 key technology trends – including practical examples from a wide range of industries – in my new book, Tech Trends in Practice: The 25 Technologies That Are Driving The 4th Industrial Revolution.

 


Sourced from Forbes

By Nick Hourigan.

Financial crime is growing ever more sophisticated, but so are the techniques used to fight and detect it. What are the developments in the crimefighting industry?

Financial crime doesn’t happen in isolation. Criminals — money launderers, terrorists, fraudsters — evolve apace or ahead of the institutions and individuals they exploit. Global headlines, from Panama to Moscow and beyond, raise the question: “Why can’t we keep up?”

How data experts can help combat financial crime

Nick Hourigan, Senior Managing Director and Head of Data and Analytics at FTI Consulting, is one of the people tasked with fighting this growing problem.

He says that, at the point a regulator is asking really probing questions of a financial institution, “harm has probably already occurred, most likely over the course of years.”

“If there is crime, it’s usually in layers and the extent is often broader than even the regulator’s influence.” While Hourigan always advises his clients to seek objective legal and strategic advice, he emphasises time and attention from senior leadership, compliance and systems owners is vital, but challenging to synchronise.

“These are hard problems that can have major reputational and financial impacts for organisations — but that isn’t always obvious at the outset.” Grappling with vast data is essential to get to the truth. “I tell my teams,” he says, “you make the computer work for you; because you can bet the criminal did.”

The benefits of ubiquitous data

So, what challenges can be met using advanced analytics and process automation? “The response must draw upon a complete picture of events and parties,” says Hourigan. “One of the key benefits of advanced analytics and process automation is the ability to combine, analyse and review information sourced from very different business systems and processes. Historically, you may not have been able to identify that someone’s activity didn’t match up with how they present themselves, because the inconsistencies are scattered across different systems.”
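Hourigan’s point about scattered inconsistencies can be sketched in a few lines. Everything below — the customer IDs, the two “systems,” and the flagging rules — is invented for illustration; real deployments join far messier records, but the principle of combining sources to surface mismatches is the same:

```python
# Invented records standing in for two separate business systems: a
# customer-profile store and a payments log. Joining them surfaces
# inconsistencies that neither system reveals on its own.
profiles = {
    "C001": {"declared_income": 40_000, "country": "UK"},
    "C002": {"declared_income": 250_000, "country": "UK"},
}
payments = [
    {"customer": "C001", "amount": 95_000, "country": "Panama"},
    {"customer": "C002", "amount": 12_000, "country": "UK"},
]

def flag_inconsistencies(profiles, payments):
    """Flag payments that clash with the payer's declared profile."""
    flags = []
    for p in payments:
        profile = profiles[p["customer"]]
        if p["amount"] > profile["declared_income"]:
            flags.append((p["customer"], "amount exceeds declared income"))
        if p["country"] != profile["country"]:
            flags.append((p["customer"], "payment routed via unexpected country"))
    return flags

for customer, reason in flag_inconsistencies(profiles, payments):
    print(customer, reason)
```

Here C001’s payment clashes with the declared profile on both amount and geography — exactly the kind of inconsistency that stays invisible while the two systems are reviewed separately.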

FTI itself aims to combine advanced analytics, e-discovery services and financial crime expertise to fight the growing problem.

“FTI does not have many people who try to be all things,” explains Hourigan. “We have technical and topical experts engaged on a project and we are able to put together teams in which the decision-making process is led by some of the most experienced global experts in the fields of financial services, regulatory investigations and data and analytics.”

Tailored financial crime solutions for large and small businesses

The critical areas of development of advanced analytics and process automation to combat financial crime depend on the size and profile of the institution, says Hourigan.

“Larger institutions will likely need to leverage data from across a greater range of geographies and services, each supported by many-layered operations. Smaller and ‘challenger’ institutions may have a simpler data and systems landscape but less established operational and compliance support. Across the spectrum, there are powerful techniques such as network analytics that can be tailored to complement traditional investigation techniques and provide a comprehensive, sophisticated and robust response to financial crime.”

In essence, to continue to meet and prevent the threat of financial crime, financial institutions need to rapidly combine, evaluate and make decisions on any data that may indicate criminal activity. Across all of these demands, advanced analytics and process automation are essential tools to get the right data to the right person at the right time.

By Nick Hourigan

Senior Managing Director, Head of Data and Analytics, FTI Consulting

Sourced from CITY A.M.

Sourced from DIGIDAY

 

For brand marketers, the still-new year brings new opportunities and new priorities alike. Changes in platforms and measurement are keeping some execs on their toes, while others remain restless over ROI and attribution.

At the Digiday Brand Summit in Scottsdale, Arizona, we asked five prominent marketers about what’s keeping them up at night. Their concerns ranged, but one thing remained consistent: these marketers are ready to tackle these worries head-on.

Here are some highlights:

  • Rachel Finley, content & community strategist at Hero Cosmetics, is kept awake by concerns about scale, particularly with influencer marketing. It’s one thing to maintain personal relationships and a level of intimacy with a set number of people — but what happens when that number goes from 100 to 1,000?
  • Antonia Hock, global head of the Ritz-Carlton Leadership Center, wants to see a world in which business leaders better understand their power. She discusses people at brands and companies that simply punch in and punch out, and says that’s a waste of human energy.
  • Ben Conniff, co-founder and CMO of Luke’s Lobster, names communication as his chief concern. He says he wants to ensure that all of his teams are moving toward the same goal — a constant challenge for a business that’s growing in multiple verticals.
  • David Zane, managing director, marketing, NASCAR, stays up thinking about emerging platforms. He says the goalposts are always moving, and so it’s up to marketers to stay ahead and determine what constitutes success.
  • Erica Chan, strategy and operations, North America B2B at Alibaba Group, says her team operates like a startup: that means worrying about prioritization and ROI. Being able to attribute cause and effect is difficult — and figuring out how to make that case compellingly is what keeps her up at night.

Hard as it may be to believe, it’s that time of year again – and no, I’m not talking about making Christmas lists, planning how best to avoid the in-laws over the festive season, and having mild panic attacks about how you’re going to afford all the presents and festivities that the coming months have in store. No, it’s the end of another year, which means it’s time to speculate about content trends for the coming 12 months.

Here are four trends that I think will have a significant impact on SEO content in 2020:

It’s not about length – it’s what you do with it that counts

As digital marketers, we sometimes get a little obsessed with hard and fast rules. It’s inevitable. We work in an industry based on understanding and algorithms, on following best practices, using fool-proof formulae, getting the inputs just right to achieve a precise result. A lot of the time, I think that’s what makes what we do rewarding. But I think one of the mistakes we make is to look for a right answer when there isn’t one.

The question of how long a piece of content should be is divisive because there really is no right answer. Actually, it’s worse than that. There are a lot of right answers. People have short attention spans, so writing concise, 500-word blogs is the way to go, right? But if you look at the top result for just about any search, you’ll find the word count rarely dips below a thousand. So longer must be better, then. Well, you can’t argue with the fact that most readers only get about halfway through a piece of content, and that many don’t even scroll to begin with. The reality is that there’s no ideal length for content, because length in itself doesn’t mean anything. What does matter is how well you’re answering the question, or addressing the needs of your reader.

In my experience, it’s safer to lean towards the longer side. There’s nothing more frustrating than seeing a great-sounding blog title, and opening the link to find 200 words of half-baked, keyword-stuffed content that doesn’t really say anything at all. It’s equally painful, though, when you start reading a long-form article and realise the writer is trying to draw out a 300-word idea into 3,000. Ultimately, longer content is good, but there are certainly diminishing returns.

Voice search will make you question everything

‘Always read your writing aloud.’ That might be the single best piece of advice I’ve ever heard as a writer. And, since voice search is expected to account for as much as half of all online search traffic by 2020, it takes on a new meaning: if you aren’t reading your own writing out loud, Google’s going to do it for you, and you’d better make sure the results are good enough to drive interaction or conversion.

The key thing to realise here is that voice search is fundamentally different from text search. The average text search phrase, for example, is around one to three words, while the average voice search phrase hovers more around three to six words. Voice searches are also far more likely to be phrased as questions. People talk to their voice assistants like they’re talking to a real person, so it follows that content should respond in kind if it hopes to meet the needs of the searcher.

For content to soak up the lion’s share of voice searches, it needs to be written more conversationally than you might be used to, and it needs to home in on answering the questions that the user is asking. Content that answers questions head-on, shows a clear understanding of search intent and sheds as much of the unnecessary detail as possible is bound to perform better for voice search traffic, so expect this trend to become increasingly prevalent in the coming months and years.

Zero is greater than one

Another consequence of voice assistants becoming the go-to search channel is the importance of Position Zero: whenever a user inputs a voice search query, their assistant will read out the position zero result before delivering the rest. So, even if you’re dominating the search results for the entire first page, a competitor with the zero spot is going to soak up 100% of the voice search traffic and leave your hard-fought position one content starved for clicks.

Gartner estimates that around a third of searches will be done without a screen at all in 2020, which means that anything beyond the position zero result might as well not exist for voice search purposes. Expect blogging content and other written forms to include an increasing amount of structured data, rich data snippets, and content specifically designed to rank above position 1. This will be particularly important for content with a local element (since a large part of voice search queries centre around local search) and bottom-of-the-funnel searches.
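One common way sites compete for position zero is by marking answers up with schema.org structured data embedded as JSON-LD. The page content and Q&A below are invented, and this is only a minimal sketch of generating such markup — valid structured data improves eligibility for rich results but guarantees nothing:

```python
import json

# A hypothetical FAQ entry answering a question head-on. Search engines
# parse schema.org "FAQPage" JSON-LD blocks like this when assembling
# rich results and position-zero answers.
faq_markup = {
    "@context": "https://schema.org",
    "@type": "FAQPage",
    "mainEntity": [
        {
            "@type": "Question",
            "name": "How long should a blog post be for SEO?",
            "acceptedAnswer": {
                "@type": "Answer",
                "text": "There is no ideal length; answer the searcher's "
                        "question as completely and concisely as possible.",
            },
        }
    ],
}

# The JSON-LD goes inside a <script> tag in the page's <head>.
script_tag = (
    '<script type="application/ld+json">'
    + json.dumps(faq_markup)
    + "</script>"
)
print(script_tag)
```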

This time, it’s personal

There’s no doubt that personalised marketing messaging works. We live in the age of the individual consumer: people are accustomed to their social media feeds, email inboxes and mobile experiences being tailored to their preferences and interests. So, it follows that expectations are the same for any content they engage with while searching or browsing.

For advertising the remedy is rather simple: serve ads that are targeted at specific factors and show an awareness of the individual customer’s context, preferences and their position in the sales funnel. But for ‘raw’ SEO content – that is, blogs, website copy, landing pages, etc. – it’s a little less straightforward. Depending on how deep down the rabbit hole you want to go, you could include forms, quizzes and surveys to understand exactly who you’re talking to before serving them tailored content, or you could go the simpler route and profile your user base into different personas who are likely to respond to different messaging.

Expect increasingly tailored, topic-focused content to come to the fore even more so than it already has in recent years. Again, customers are increasingly engaging with content that makes real conversation with them and demonstrates an understanding of their context, preferences and what they’re looking for. The more granular you can get when it comes to understanding those factors, the better you’ll resonate with your readers.

Sourced from The Drum

By Cassie Kozyrkov

Understanding the value of two completely different professions

Statistics and analytics are two branches of data science that share many of their early heroes, so the occasional beer is still dedicated to lively debate about where to draw the boundary between them. Practically, however, modern training programs bearing those names emphasize completely different pursuits. While analysts specialize in exploring what’s in your data, statisticians focus more on inferring what’s beyond it.

Disclaimer: This article is about typical graduates of training programs that teach only statistics or only analytics, and it in no way disparages those who have somehow managed to bulk up both sets of muscles. In fact, elite data scientists are expected to be full experts in analytics and statistics (as well as machine learning)… and miraculously these folks do exist, though they are rare.

Human search engines

When you have all the facts relevant to your endeavor, common sense is the only qualification you need for asking and answering questions with data. Simply look the answer up.

Want to see basic analytics in action right now? Try Googling the weather. Whenever you use a search engine, you’re doing basic analytics. You’re pulling up weather data and looking at it.

Even kids can look facts up online with no sweat. That’s the democratization of data science right there. Curious to know whether New York is colder than Reykjavik today? You can get near-instant satisfaction. It’s so easy we don’t even call this analytics anymore, though it is. Now imagine trying to get that information a century ago. (Exactly.)

When you use a search engine, you’re doing basic analytics.

If reporting raw facts is your job, you’re pretty much doing the work of a human search engine. Unfortunately, a human search engine’s job security depends on your bosses never finding out that they can look the answer up themselves and cut out the middleman… especially when shiny analytics tools eventually make querying your company’s internal information as easy as using Google Search.

Inspiration prospectors

If you think this means that all analysts are out of a job, you haven’t met the expert kind yet. Answering a specific question with data is much easier than generating inspiration about which questions are worth asking in the first place.

I’ve written a whole article about what expert analysts do, but in a nutshell they’re all about taking a huge unexplored dataset and mining it for inspiration.

“Here’s the whole internet, go find something useful on it.”

You need speedy coding skills and a keen sense of what your leaders would find inspiring, along with all the strength of character of someone prospecting a new continent for minerals without knowing anything (yet) about what’s in the ground. The bigger the dataset and the less you know about the types of facts it could potentially cough up, the harder it is to roam around in it without wasting time. You’ll need unshakeable curiosity and the emotional resilience to handle finding a whole lot of nothing before you come up with something. It’s always easier said than done.

Here’s a bunch of data. Okay, analysts, where would you like to begin?

While analytics training programs usually arm their students with software skills for looking at massive datasets, statistics training programs are more likely to make those skills optional.

Leaping beyond the known

The bar is raised when you must contend with incomplete information. When you have uncertainty, the data you have don’t cover what you’re interested in, so you’re going to need to take extra care when drawing conclusions. That’s why good analysts don’t come to conclusions at all.

Instead, they try to be paragons of open-mindedness if they find themselves reaching beyond the facts. Keeping your mind open is crucial, else you’ll fall for confirmation bias — if there are twenty stories in the data, you’ll only notice the one that supports what you already believe… and you’ll snooze past the others.

Beginners think that the purpose of exploratory analytics is to answer questions, when it’s actually to raise them.

This is where the emphasis of training programs flips: avoiding foolish conclusions under uncertainty is what every statistics course is about, while analytics programs barely scratch the surface of inference math and epistemological nuance.

Without the rigor of statistics, a careless Icarus-like leap beyond your data is likely to end in a splat. (Tip for analysts: if you want to avoid the field of statistics entirely, simply resist all temptation to make conclusions. Job done!)

Analytics helps you form hypotheses. It improves the quality of your questions.

Statistics helps you test hypotheses. It improves the quality of your answers.

A common blunder among the data unsavvy is to think that the purpose of exploratory analytics is to answer questions, when it’s actually to raise them. Data exploration by analysts is how you ensure that you’re asking better questions, but the patterns they find should not be taken seriously until they are tested statistically on new data. Analytics helps you form hypotheses, while statistics lets you test them.

Statisticians help you test whether it’s sensible to behave as though the phenomenon an analyst found in the current dataset also applies beyond it.
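That kind of follow-up can be sketched without any special tooling. Suppose an analyst noticed, while exploring, that returning users seem to spend more time on site than new users; a statistician then checks the pattern on fresh data with a permutation test. The numbers below are invented for illustration:

```python
import random

random.seed(0)

def permutation_test(sample_a, sample_b, n_permutations=10_000):
    """Two-sided permutation test for a difference in means.

    Returns the p-value: the fraction of random relabelings whose
    mean difference is at least as extreme as the observed one.
    """
    observed = abs(sum(sample_a) / len(sample_a) - sum(sample_b) / len(sample_b))
    pooled = list(sample_a) + list(sample_b)
    count = 0
    for _ in range(n_permutations):
        random.shuffle(pooled)
        perm_a = pooled[: len(sample_a)]
        perm_b = pooled[len(sample_a):]
        diff = abs(sum(perm_a) / len(perm_a) - sum(perm_b) / len(perm_b))
        if diff >= observed:
            count += 1
    return count / n_permutations

# Invented "new" data: minutes on site, gathered AFTER the pattern
# was spotted. Did the analyst's hunch survive contact with fresh data?
returning_users = [12.1, 11.8, 13.0, 12.6, 12.9, 13.4, 12.2, 12.7]
new_users       = [10.2, 10.9, 11.1, 10.5, 11.4, 10.8, 11.0, 10.6]

p_value = permutation_test(returning_users, new_users)
print(f"p-value: {p_value:.4f}")
```

A small p-value says it’s sensible to behave as though the pattern generalises; a large one sends the analysts back prospecting.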

I’ve observed a fair bit of bullying of analysts by other data science types who seem to think they’re more legitimate because their equations are fiddlier. First off, expert analysts use all the same equations (just for a different purpose) and secondly, if you look at broad-and-shallow sideways, it looks just as narrow-and-deep.

I’ve seen a lot of data science usefulness failures caused by misunderstanding of the analyst function. Your data science organization’s effectiveness depends on a strong analytics vanguard, or you’re going to dig meticulously in the wrong place, so invest in analysts and appreciate them, then turn to statisticians for the rigorous follow-up of any potential insights your analysts bring you.

You need both!

Choosing between good questions and good answers is painful (and often archaic), so if you can afford to work with both types of data professional, then hopefully it’s a no-brainer. Unfortunately, the price is not just personnel. You also need an abundance of data and a culture of data-splitting to take advantage of their contributions. Having (at least) two datasets allows you to get inspired first and form your theories based on something other than imagination… and then check that they hold water. That is the amazing privilege of quantity.

Misunderstanding the difference results in lots of unnecessary bullying by statisticians and lots of undisciplined opinions sold as a finished product by analysts.

The only reason that people with plenty of data aren’t in the habit of splitting data is that the approach wasn’t viable in the data-famine of the previous century. It was hard to scrape together enough data to be able to afford to split it. A long history calcified the walls between analytics and statistics so that today each camp feels little love for the other. This is an old-fashioned perspective that has stuck with us because we forgot to rethink it. The legacy lags, resulting in lots of unnecessary bullying by statisticians and lots of undisciplined opinions sold as a finished product by analysts. If you care about pulling value from data and you have data abundance, what excuse do you have not to avail yourself of both inspiration and rigor where it’s needed? Split your data!
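Under the data-abundance assumption, the split itself is trivial; the discipline lies in never letting exploratory eyes touch the confirmation half. A minimal sketch, with invented record IDs standing in for real rows:

```python
import random

random.seed(42)

# Invented records standing in for a larger dataset.
records = list(range(1000))

# Shuffle once, then wall off a confirmation set the analysts never see.
random.shuffle(records)
split = len(records) // 2
exploration_set = records[:split]   # analysts roam here for inspiration
confirmation_set = records[split:]  # statisticians test hypotheses here

# Sanity check: no record may appear in both halves.
assert not set(exploration_set) & set(confirmation_set)
print(len(exploration_set), len(confirmation_set))
```

Hypotheses are formed on the exploration half and tested, once, on the untouched confirmation half — which is what lets each discipline act as a force multiplier for the other.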


Once you realize that data-splitting allows each discipline to be a force multiplier for the other, you’ll find yourself wondering why anyone would approach data any other way.

By Cassie Kozyrkov

Head of Decision Intelligence, Google. ❤️ Stats, ML/AI, data, puns, art, theatre, decision science. All views are my own. twitter.com/quaesita

Sourced from Towards Data Science

Sourced from PHYS ORG

Are we hooked like digital junkies or can we wean ourselves away from the screens which dominate our lives?

Between distractions, diversions and the flickering allure of a random suggestion, the major computer platforms aim to keep us glued to our screens come what may. Now some think it is time to escape the tyranny of the digital age.

Everyone staring for hours at a screen has had some exposure to “captology”—a word coined by behavioural scientist BJ Fogg to describe the invisible and manipulative way in which technology can persuade and influence those using it.

“There is nothing we can do, like it or not, where we can escape persuasive technology,” this Stanford University researcher wrote in 2010.

All of us experience this “persuasive technology” on a daily basis, whether it’s through the endlessly scrollable Facebook feed or the autoplay function on Netflix or YouTube, where one video flows seamlessly into another.

“This wasn’t a design ‘accident’, it was created and introduced with the aim of keeping us on a certain platform,” says user experience (UX) designer Lenaic Faure.

Working with “Designers Ethiques”, a French collective seeking to push a socially responsible approach to digital design, Faure has developed a method for assessing whether the attention-grabbing element of an app “is ethically defensible.”

In the case of YouTube, for example, if you follow the automatic suggestions, “there is a sort of dissonance created between the user’s initial aim” of watching a certain video and “what is introduced to try and keep him or her on the platform,” he says.

Ultimately the aim is to expose the user to partner advertisements and to better understand their tastes and habits.

Dark patterns

UX designer Harry Brignull describes such interactions as “dark patterns”, defining them as interfaces that have been carefully crafted to trick users into doing things they may not have wanted to do.

“It describes this kind of design pattern—kind of evil, manipulative and deceptive,” he told AFP, saying the aim was to “make you do what the developers want you to do.”

One example is the newly introduced EU data protection rules, which require websites to obtain users’ consent before collecting their valuable personal data.

“You can make it very, very easy to make people click ‘OK’ but how can you opt out, how can you say ‘no’?”

Even for him, as a professional, it can take at least a minute to find out how to refuse.

In today’s digital world, attention time is a most valuable resource.

“The digital economy is based upon competition to consume humans’ attention. This competition has existed for a long time but the current generation of tools for consuming attention is far more effective than previous generations,” said David SH Rosenthal in a Pew Research Center study in April 2018.

“Economies of scale and network effects have placed control of these tools in a very small number of exceptionally powerful companies. These companies are driven by the need to consume more and more of the available attention to maximise profit.”

Internet as tool, not trap

Faure suggests that for a design to be considered responsible, the objective of the developer and that of the user must largely line up and equate to the straightforward delivery of information.

But if the design modifies or manipulates the user, directing them towards something they did not ask for, that should then be classed as irresponsible, he says.

French engineering student Tim Krief has come up with a browser extension called Minimal, which offers users a “less attention-grabbing internet experience” on the grounds that the internet “should be a tool, not a trap”.

The extension aims to mask the more “harmful” suggestions channelled through the major platforms.

An open source project, the extension should “make users more aware about such issues”, Krief says.

“We don’t attribute enough importance to this attention economy because it seems invisible.”

Design as a defence

But is this enough to fight the attention-grabbing tactics of powerful internet giants?

Brignull believes some designers can bring about change but are likely to be restricted by the wider strategy of the company they work for.

“I think they will have some impact, a little impact, but if they work in companies, those companies have a strategy… so it can be very difficult to have an impact on the companies themselves.”

Isabelle Falque-Pierrotin, former head of the French Data Protection Authority (CNIL) also believes that design can be used to effect positive change.

“Design could be another defence whose firepower could be used against making individuals the ‘playthings’ of developers,” she said in January in a presentation on the “attention economy.”

Faure says he has seen a growing demand for an ethical approach to digital design and thinks his method could help “bring better understanding between users of services and the people who design them.”

This type of initiative “could be a way to tell the big platforms that such persuasive designs really bother us,” Krief says.


Sourced from arch daily

What can you learn from enterprising firms who push tech to new limits? It is time to be inspired to experiment with innovative technology that supports BIM. The software that opens projects up to new possibilities is the one that helps you benefit from ground-breaking techniques. For firms using ARCHICAD, 3D modeling, photorealism and VR experiences are more than gimmicks: they are part of a powerful toolset that opens the door to those possibilities. Hear from the firms who have unlocked that power in By Design: The Next Frontier.

“The Next Frontier” highlights an inspiring approach to design, coordination and project management – rooted in BIM and enabled by the design flexibility found in ARCHICAD. Three firms explore how every aspect of their design process can be used to collaborate with the structural engineer and to inform the contractor and the owner at a higher level.

Their approach to communicating design and maintaining open collaboration changed their workflow for the better. Including 3D modeling, photorealism, VR and AR – all supported in ARCHICAD – enhances the conversations they have with clients, enriches the process and leads to new work. These technological advances, partnered with the power of ARCHICAD, allow firms to leverage technology, help their clients be more engaged and make their projects more efficient.

Three Architecture Firms Explore the Benefits of BIM in "By Design: The Next Frontier"

Communicating changes, costs, quantities and sequencing – in addition to great design – adds up to running the project efficiently. Contractors want that added level of clarity, and all stakeholders want an accurate and complete data set. A smoother, more efficient process is out there for you and your client – engineers and architects working in the same virtual model, to name just one benefit.

Be inspired by the revolutionary firms who embrace the fascinating things that come from having the right combination of technology and innovation.



The social network paid people to monitor their phone activity and Apple was not happy

Facebook and Apple are in another fight over privacy and data after reports surfaced on Wednesday that Facebook built a consumer research app that opened a backdoor to iPhones. The phonemaker, which disabled the app, has accused the social network of violating its app rules.

Apple and Facebook have had a contentious relationship since Apple CEO Tim Cook took a hardline stance against data-collection practices of internet ad giants, calling for more regulations in the industry. Facebook then hired a public relations firm to push back against the criticism of its business model.

The latest episode in the saga is a bit hard to follow. To help, here’s our guide to what happened.

The Facebook Research App
Facebook recruited phone users to install a consumer research app that tracked their web traffic, messaging, app usage and more. About 5 percent of the participants were younger than 18, according to Facebook. (Minors were prompted to get permission from parents during the download process, for what that’s worth.) The app program was managed by third-party companies uTest, BetaBound and Applause, which helped distribute the app.

Quick cash for consumers
People who participated in the consumer research typically received $5 to $10 to download the app and up to $20 a month to keep it active. It was almost like a multilevel data marketing deal because people could also make money for each person they referred, and then extra money each month that those people kept the app active. According to online commenters who say they participated in the program, people could potentially even make hundreds of dollars a month. (Facebook did not respond to a request for comment.)

Why does Apple care?
In August, amid a privacy backlash against Facebook, Apple shut down a similar app from Facebook called Onavo, which also collected details about people’s phone usage. Apple said the app violated its App Store policies, which state that apps should not collect data about other apps on people’s phones.

Facebook’s workaround
The new research app avoided Apple’s App Store by using a program that Apple created for enterprise customers. Companies like Facebook use the enterprise program to build internal company apps, apps for communication, transportation and other logistics useful to employees. However, the apps in the enterprise program are only for employees.

Who the fallout is affecting
Perhaps the people most affected at this point are Facebook employees. Apple not only disabled the research app, it shut down all of Facebook’s other utility apps for employees, reportedly leading to some chaos at the office. Facebook has said it’s talking to Apple about getting its internal apps back online.

Without the internal app program, Facebook will also have trouble beta testing changes to its main apps, such as trying out a new design on Instagram or a new feature on WhatsApp among employees first.

Also on the case: lawmakers
Lawmakers have added this issue to the host of others that led Congress to call CEO Mark Zuckerberg and COO Sheryl Sandberg to testify before them last year. On Wednesday, Sen. Mark Warner, D-Virginia, issued a statement that said, “I have concerns that users were not appropriately informed about the extent of Facebook’s data-gathering and the commercial purposes of this data collection.”

What about those consumers?
Everyone who participated was aware they were taking part in market research, according to Facebook. Google and other companies have similar research programs, and Nielsen employs thousands of everyday Americans to share their TV viewing habits for market research.

On the other hand, it’s hard to tell if Facebook adhered to the strictest standards of disclosure, and how well-informed participants were. And Facebook already has been under a microscope for privacy and data-sharing issues, most notably the Cambridge Analytica scandal. There have also been questions raised about how Facebook handled user privacy and data, especially in its early days.

Bottom line
No advertiser will pull their money from Facebook over this, but they will call their ad agency and ask what the hell is happening, again.

Feature Image Credit: Bloomberg


Sourced from AdAge


2019 is set to see ecommerce sales increase by 19.5% globally, offering savvy brands who are up to speed on the latest web design trends and developments an opportunity to drive significant additional market share.

But what do brands need to bear in mind in 2019 to ensure that they continue to deliver relevant standout online design, and therefore sales?

Mobile First

It’s vital to implement mobile-first design in 2019. In 2015, mobile searches overtook those on desktop, making mobile the dominant form of search worldwide. In line with this, Google has changed which sites it indexes first – it now prioritises mobile sites over those that aren’t mobile friendly.

However, it’s worth bearing in mind that this push toward mobile-first design isn’t just about ranking factors or SEO: the visual result must enhance the user’s experience on the device they are most likely to be searching from.

This focus on mobile first requires a fundamental shift in the way that websites are designed. It used to be that a site would be created for a desktop or laptop computer, with a mobile-friendly or mobile-responsive design added as an afterthought. Today, it’s critical to design the site for the mobile user first, before creating a version that will also stand out for those on desktops.

Micro-animations/movement

Using micro-animations along with feedback loops – which deliver movement when hovering over an icon – helps make websites more usable and engaging. The details of these micro-interactions, the button clicks and the page transitions, can greatly improve a user’s experience on your site, making them far more likely to return. It’s this meaningful motion, connecting an action with a reaction, that satisfies a user’s desire for interactivity. And with touch interfaces, especially on small screens, it has never been more important to use micro-animations and feedback loops to make interactions smooth and guide users on their journey to checkout.

Custom and classic fonts

Expect a move back to custom and classic font design – clean but formal – with bigger and bolder typefaces, as brands aim to stand out against the proliferation of humanist typefaces.

Colour

Bright colours should be used more liberally in 2019 to deliver greater standout. The last two years have seen an explosion of big, bold colour across the internet, with an increasing number of brands choosing to use their core packaging colours as backing for their graphics, and clashing tones moving from edgy start-ups into the mainstream. Those who have embraced arresting colours include the Premier League, Sky and eBay. Bear in mind, though, that a classic font design and bright colours won’t be suitable for all: the choice of font and colours has to be right for the values of the brand and resonate with the target audience.

Optimise for search

As is always the case, making sure the design of your website is optimised for search algorithms is vital. Developments in web design will be driven by what Google’s constantly evolving search algorithm looks for. To this end, make sure that the content being communicated is relevant to your target audience and written as naturally as possible; Google looks for honest, human-generated content. Of course, this must be quality content that encourages others to link back to your site and aid your SEO efforts. If users share your copy, this signals to Google that you are a valuable resource, and the reward for your efforts will be an improved organic search ranking.

Speed

With research revealing that over half of consumers leave a website if it takes more than three seconds to load, websites must be designed with speed in mind. The faster your site loads, the better it will rank in search results, particularly in Google search. This is not to say that websites should be sparse affairs with limited content and imagery for the sake of speed; with better broadband it’s much easier to have image- and content-heavy sites that still load quickly. If you have an app, however, it’s seriously worth considering building it as a Progressive Web App (PWA) for speed purposes. A PWA can be launched from a home screen and be ready in less than a second, often beating native apps in load times.
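As a rough way to sanity-check the three-second rule of thumb, server response time can be measured with a short Python sketch. This is a minimal illustration using only the standard library; the URL is a placeholder, and note that it captures network fetch time only, not the full render time a visitor experiences in the browser.

```python
import time
import urllib.request

def response_time(url: str, timeout: float = 10.0) -> float:
    """Return the seconds taken to fetch the raw response body of `url`.

    This measures time-to-response plus download only; real page-load
    time also includes parsing and rendering in the browser.
    """
    start = time.perf_counter()
    with urllib.request.urlopen(url, timeout=timeout) as resp:
        resp.read()
    return time.perf_counter() - start

if __name__ == "__main__":
    seconds = response_time("https://example.com")  # placeholder URL
    verdict = "too slow" if seconds > 3.0 else "ok"
    print(f"{seconds:.2f}s ({verdict})")
```

Tools like Google's PageSpeed Insights measure the full user-perceived load time and are a better guide for production sites; a sketch like this is only a quick first check.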

All brands need to constantly evolve their web design to continue to stand out and deliver an engaging experience that generates sales. By keeping these six web design points front of mind, brands will be well placed for a profitable 2019 online.

By James Pruden

James Pruden is studio director at Xigen

Sourced from The Drum