Some article websites (I’m looking at msn.com right now, as an example) show the first page or so of article content and then a “Continue Reading” button, which you must click to see the rest. This seems so ridiculous from a UX perspective: I know how to scroll down to continue reading, so why hide the text, make me click a button, and then have me scroll anyway? Why has this become such a common practice?
Web manager here. There are some good answers already. Let me add a few more.
Engagement. If you land on a page, don’t engage with anything, and leave, Google doesn’t even count you as a user. The more you do on the page, the higher Google will rank you.
Data analysis: it lets us test whether the article is valuable. If nobody clicks Continue, we know we might need to rework the article.
Page load: The biggest, and I mean biggest, reason someone leaves a page is page load speed. If you’re deep into researching something and the page takes over 3 seconds to load, you will leave the site, regardless of your internet speed or whether the fault is on your end. Loading only a quarter of the page helps with this, along with other tricks like caching at the CDN and lazy loading.
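To make the page-load point concrete: the core of the pattern is tiny. Here’s a stripped-down sketch, with made-up element ids and an assumed /rest endpoint rather than any real site’s code:

```javascript
// Sketch of a "Continue Reading" button: the server initially sends only
// the first chunk of the article; the rest is fetched on demand.
// (#continue-reading, #article-body, and the /rest endpoint are made up.)
const btn = document.querySelector('#continue-reading');
btn.addEventListener('click', async () => {
  btn.disabled = true; // avoid double fetches while the request is in flight
  const res = await fetch(`/articles/${btn.dataset.articleId}/rest`);
  document.querySelector('#article-body')
          .insertAdjacentHTML('beforeend', await res.text());
  btn.remove(); // the click itself is also an engagement signal we can log
});
```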
There are tons more reasons, but we found that the “Continue” button wasn’t detrimental to site performance.
> Page load: The biggest, and I mean biggest, reason someone leaves a page is page load speed.
The thing that always bothers me about this is that I’ve been using the internet since 90s dial-up, and even 90s dial-up never had a “page load speed” problem when loading text-based articles. An extremely conservative estimate is that modern broadband is 1000x faster than dial-up was (56 kbit/s vs. 100+ Mbit/s is closer to 2000x), so “page load speed” is entirely about the design of the website, and the excuse mostly seems to be “we want to spy on people”. Am I wrong? Otherwise, why not write an HTML page that would be just as compatible with Geocities as it is now?
You can still write plain HTML websites, and they would be super fast! But that’s not how we do things, damnit! I need to implement feature X. Do I spend all day rolling my own lean version? Fuck no. I download a 5-ton JavaScript library that already has that feature and fuck off for the rest of the day.
You are correct on one thing. The math does not add up at all.
The root cause is the current meta of software development: bloat. Software is so ungodly bloated today because we’ve been taught, for as long as I can remember, that hardware is so fast nowadays that we don’t need to care about performance. Because of this mindset, many of the best practices we were taught work directly against performance (OOP was a mistake. Fight me).
There might be overhead from the ad-tracking bullshit, sure. But if developers cared about performance, that ad tracking could be fast too ;]
How long should it really take to render a webpage? It should be near instant. If modern games can render a full 3D landscape over 100 times a second, surely a wall of text and some images can be rendered in under a second, right?
This is a problem in all software. For a simple example, I remember Microsoft Word from 20 years ago being quite snappy on the desktops of the time, and by comparison we are running supercomputers today; a cheap Android phone would blow that desktop out of the water. Yet, somehow, Word is a dog now…
The biggest reason I leave a page is paywalls and ads.
Some of my clients do not have the budget to give you free content without ads. Even a (usable) shared hosting server costs around 25 bucks a month. Add in dev time and design, and small mom-and-pop sites can’t afford to be ad-free.
Only the big dogs do paywalls.
That’s funny; I always thought “Continue Reading” was a paywall button leading to a subscription page, so I just back right out.
Then the article isn’t strong enough and will be rewritten. The more relevant the article is to your search, the higher the chance you’ll continue reading.
I’m not sure you understood me. I assumed the Continue Reading button would ask me to pay, and since I am not going to pay, I never continued reading.
Ahhh, I think you might be an edge case. The users we tested this on all understood what would happen after clicking.
I’m part of your edge case too.
I also back out of pages that have this, for the same reason.
I’ve also assumed the same. There’s no way it’s a rare enough edge case not to be impactful
Why do ads (videos with loud sound) always load before any meaningful parts of the page?
Because many ads are served by a third-party CDN that’s more robust than the servers hosting the article itself.
It can also be down to how the page is coded.
As I mentioned, small mom-and-pop shops can’t afford to give you free content without ads, so they prioritize the ad to make sure they get paid for the impression.
Unfortunately the content is not free to create and maintain.
All those big newspaper websites are small mom and pop shops? TIL…
Not all, but I have plenty of one- or two-person sites that are purely ad-based for income.
Also, a lot of websites are built on a CMS that has [Read More]… baked in. E.g., WordPress is designed around the concept of an excerpt for each page/post, a convention from when it was built over 20 years ago. Although, as others have pointed out, the time/data savings are minimal; that mattered when WordPress was invented, and the excerpt is now a vestigial part of the system.
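For anyone curious, WordPress’s version of this is literally a comment tag you drop into the post body (this is the real `<!--more-->` tag; the surrounding copy is made up):

```html
<p>Intro paragraph that shows up in the excerpt on index and archive pages…</p>
<!--more-->
<p>Everything below the more tag sits behind the “Read More” link
   until the reader clicks through to the full post.</p>
```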
> Page load
It would be fine if they only loaded a partial page so that it renders in my browser quicker.
However, what usually happens is that the entire page loads, then an overlay pops up to get me to register or pay, or whatever.
Being a web developer, it’s not hard for me to inspect the page and remove the overlay so I can read everything, but it is an annoyance.
As a person who knows nothing about web development: can you not load the page in smaller chunks, so that the first screen or two of stuff loads fast and the rest loads while you are looking at it? That way, to the user, it appears to load quickly enough to keep them from leaving.
It’s a bullshit excuse. A couple pages of text load in a second or two on even poor connections. They’re optimizing for ads and tracking.
Let me correct my other comment here: I miss when a 9600 baud modem was fast, but holy crap has the internet gone downhill. Now get off my lawn.
What you’re talking about is called lazy loading. It loads the text and CSS first, then the images after.
Most modern sites now do this, along with not loading the rest of the page at all until you hit the Continue button. That not only reduces your browser’s load, it reduces server load as well.
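A rough sketch of the scroll-driven version, assuming a hypothetical /article/next-chunk endpoint and made-up element ids:

```javascript
// Load the next chunk of the article whenever a sentinel element at the
// bottom of the loaded content scrolls into view. IntersectionObserver is
// a standard browser API; the endpoint and ids here are assumptions.
const sentinel = document.querySelector('#load-more-sentinel');
const observer = new IntersectionObserver(async ([entry]) => {
  if (!entry.isIntersecting) return;
  const res = await fetch('/article/next-chunk');
  if (!res.ok) { observer.disconnect(); return; } // no more chunks to load
  document.querySelector('#article-body')
          .insertAdjacentHTML('beforeend', await res.text());
});
observer.observe(sentinel);
```

For images alone, modern browsers can do this without any script at all: `<img loading="lazy">` defers the download until the image is close to the viewport.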
There are many other reasons to have the Continue button, but the positives outweigh the negatives. It’s not considered a dark pattern, and it helps the content team improve their content.
How does Google know if you interacted with something on the page?
Google offers an analytics package that a huge number of sites embed, and many other companies, like Facebook, offer similar software. Mostly people add these to track the performance of Google-published ads, but they gather a LOT more data than that. You also don’t need to use Google’s ad system to put it on your site.
Anyway, it runs JavaScript that gathers information about everything a visitor does on the site and sends it to Google. You can “opt out” by using a browser extension like NoScript. I assume ad blockers work too.
For people developing or running a site, it really does give you a ton of useful information: where your visitors are from, which pages they viewed, how they got to your site (search terms, ads, referrers), how long they spend there, even a “heat map” showing which parts of the page people hovered over with their mouse pointer. The tradeoff is that Google gets all of this information too.
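For the curious, the standard embed is only a few lines. This is the real gtag.js bootstrap for Google Analytics 4; the G-XXXXXXX measurement ID, the element id, and the custom event name are placeholders of my own:

```html
<!-- Google Analytics 4 via gtag.js; G-XXXXXXX is a placeholder ID -->
<script async src="https://www.googletagmanager.com/gtag/js?id=G-XXXXXXX"></script>
<script>
  window.dataLayer = window.dataLayer || [];
  function gtag() { dataLayer.push(arguments); }
  gtag('js', new Date());
  gtag('config', 'G-XXXXXXX'); // page views, sessions, referrers, etc.

  // Custom events are one call each, e.g. logging a "Continue Reading"
  // click (the element id and event name are our own inventions):
  document.querySelector('#continue-reading')
    ?.addEventListener('click', () => gtag('event', 'continue_reading_click'));
</script>
```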
Chrome. You’re likely using their product. They know everything.
I am not using Chrome.
Well, the majority of people do.
> the page takes over 3 seconds to load, you will leave the site, regardless of your internet speed or whether the fault is on your end
As both a developer and an end user, this drives me batshit.
Seemingly no one has figured out that if users are bouncing due to page load times, maybe the problem is that your page, which was supposed to be, say, a recipe for a bologna sandwich, doesn’t need to first load an embedded autoplaying video, an external jQuery library, a cookie notice, three time-delayed popovers, an embedded tweet, a sidebar that dynamically loads 20 irrelevant articles, and a 2600x4800 100vw headline image that scrolls up at half speed before the user can even get any of the content into the viewport. Just a thought. I don’t care what your dog-eared copy of Engagement For Dummies says. It is actually wrong.
I have made the business I work for quite successful online by taking all of the alleged “best practice” things that clearly annoy the shit out of everyone, and just not doing them.
Enshittification
My guess is that this gives them data they can analyze on how many people actually read the page that far.
Just a guess: to prevent bots from scraping the full content?
Because they want you to see as many obnoxious ads as possible; they don’t care if you read the article, only that you view the ads. This is the new shitty web. MSN, Newsweek, and Yahoo are the scummy kings.
Maybe to make the article seem shorter, so you’re more inclined to keep reading. Once you’re halfway through, you’re more likely to want to read the rest. Both halves are probably filled with ads, so the longer you stick around, the better.