How many phone numbers do you have memorized? How many places can you navigate to without using GPS?
If you’re like me, those numbers are embarrassing. I expect both would be much higher if smartphones did not exist. I am no Luddite, but many people have become so accustomed to trusting technology that they forget to think for themselves.
If you need proof, just consider one of the many examples of drivers following GPS directions by rote and causing accidents. But trusting machines can be dangerous in more nuanced ways that may affect even those of us who think we’re pretty savvy.
In an increasingly complex world, we have come to trust that a computer will turn on when we hit the power button and will spit out the best answer to whatever we ask. Most people don’t stop to think about how the machines around them work, nor do they necessarily need to. As machines get better at what they do, automation bias – the preference to rely on algorithms instead of human experts – is spreading. However, in a society that accepts that computers are better than humans at many tasks, it is important for individuals to understand how blind trust can hurt us. Schools teach us critical reading skills, but we need to apply that type of thinking when dealing with technology, too.
Programs and algorithms are created by humans, directly or indirectly, and thus carry human biases into their results. The assumptions built into an algorithm can be benign, or can simply represent a difference of opinion between one organization and another. They can also be used to actively manipulate you.
For example, we often engage with algorithms that allow us to identify new music, videos or Instagram influencers we might like. These algorithms are designed to help you find items that interest you; why wouldn’t that be a good thing? But the primary purpose of many of these algorithms is to increase a company’s revenue by influencing your behavior. By clicking on one cat video, you are likely to trigger suggestions for several more that will lure you into sitting in front of your screen (and watching ads) for much longer than you intended. Even if you genuinely like cat videos, did you decide to watch for that long or did you just find it easier not to look away?
Customized content feeds can also create negative effects beyond tying up too much of our attention. Being fed an endless stream of fit models or friends who seem to be living a life of leisure on Instagram might seem harmless or even inspiring to some. But those images can negatively impact people’s mental health. Between photo editing software, filters and algorithms impacting what you view, social media is far from an accurate depiction of reality. Younger or more vulnerable people especially need to learn the skills necessary to put social media posts in context and avoid using them as a personal benchmark.
Separating fact from fiction online is a critical skill for all of us. Whether on social media or elsewhere on the internet, the headlines you see are often shaped by algorithms designed to generate more clicks and increase viewership. While we may recognize the bias within a given article, how often do we question why we are seeing so many stories with the same viewpoint? We often assume that such repetition reflects truth or consensus, but it can just as easily be a company or algorithm trying to mirror, or even manipulate, our way of thinking. Encouraging people to think more deeply about where their news stream comes from will help them better understand what “fake news” really is.
Algorithms drive more than the online media we consume. Several companies now use them to make offers to buy homes. I have personally received mailings from one such company, Opendoor, offering to tell me – for free – how much it would pay for my house. While I am not looking to sell, and I doubt the offer would be top dollar, I was tempted to find out. I resisted that urge. Although free is a compelling price, I expected I would pay in other ways, such as being bombarded by follow-ups and adjacent marketing campaigns. When you are offered something for free, consider the true price, even if it is only your time.
Zillow is another company looking to get into the algorithmic home-buying game. The company originally became popular due to its “Zestimate” feature – a proprietary, automated price estimate of every home in America. While it is far from perfect, many people find the Zestimate to be a reasonable comparative tool for understanding what a home might be worth. If enough people are more inclined to trust a technology-based home price estimate than to consult a human being, and if those people also sell their home to that same company, it is a recipe for a substantial wealth transfer from homeowners to Zillow.
The Wall Street Journal recently highlighted the risks investors run by accepting automated investment advice as gospel. While many consumers perceive automated advice as unbiased and trustworthy, evidence supports the idea that we should pay closer attention. Even technology-led investment decisions can suffer from biases, which users should remember to question. In other areas of finance, computers have even more influence. JPMorgan estimated that just 10% of stock trading is regular stock picking by people. The rest is generated by computer algorithms, or by people buying and selling index-based investments. Think about this when you wonder why the stock market goes up or down on a daily basis. In a real sense, it is because the computers said so. Also think about it when you decide it is a good time to buy or sell any individual investment based on what you heard on television or any other hunch. Computers make those investment decisions constantly, and they can do so much faster, and with far more data, than you can.
While I don’t think I can beat the machines when investing, I also recognize they can manipulate prices, especially for less-regulated markets like cryptocurrency. A price changing on a screen does not mean the intrinsic value of the investment has really changed. As a Certified Valuation Analyst, I know that an investment’s value heavily depends on the specific facts and circumstances of the company, and of the buyer and seller.
It is easy for many of us to identify, and therefore resist, overt attempts at manipulation. But keep your eyes peeled for subtler attempts. Whether you find yourself clicking through endless recommendations on a website, driving to a new location without grasping where you’re going, or selling your house to a tech company, remember that the computer’s goals are often not aligned with yours. Stop to think for yourself and question why you’re doing what you’re doing. Wonder, too, why you’re seeing what is in front of you and whose agenda you are following when you decide to act on it.
Senior Client Service Manager and Chief Investment Officer Benjamin C. Sullivan, who is based in our Austin, Texas office, contributed several chapters to our firm’s recently updated book, The High Achiever’s Guide To Wealth, including Chapter 5, “Investments: Fundamentals, Techniques And Psychology,” and Chapter 14, “Employment Contracts.” He was also among the authors of the firm’s book Looking Ahead: Life, Family, Wealth and Business After 55.
Posted by Benjamin C. Sullivan, CFP®, CVA, EA