Some article websites (I’m looking at msn.com right now, as an example) show the first page or so of article content and then have a “Continue Reading” button, which you must click to see the rest of the article. This seems ridiculous from a UX perspective: I know how to scroll down to continue reading, so why hide the text, make me click a button, and then have me scroll anyway? Why has this become a fairly common practice?

  • jaschen@lemmynsfw.com · 10 months ago

    Web manager here. There are some good answers here already; let me add a few more.

    Engagement. If you land on a page, don’t engage with anything, and leave, Google doesn’t even count you as a user. The more you do on a page, the higher Google ranks it.

    Data analysis: we’re testing whether the article is valuable. If nobody clicks Continue, we know we might need to rework the article.

    Page load: The biggest, and I mean biggest, reason someone leaves a page is page load speed. If you’re deep in researching something and the page takes over 3 seconds to load, you will leave the site, regardless of your internet speed or whether the fault is on your side. Loading only a quarter of the page helps with this, along with other tricks like caching at the CDN and lazy loading.

    There are tons more reasons, but we found that the “Continue” button wasn’t detrimental to site performance.
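
    A minimal sketch of how a page might wire this up, combining two of the points above: the button click is itself an engagement signal, and the rest of the article is fetched only on demand, keeping the initial load small. The endpoint, element IDs, and logEvent helper are hypothetical, not any particular site’s API.

    ```ts
    // Hypothetical sketch: serve only the teaser, fetch the remainder on click.
    const button = document.getElementById("continue-reading") as HTMLButtonElement;
    const body = document.getElementById("article-body") as HTMLElement;

    function logEvent(name: string, payload: Record<string, unknown>): void {
      // Stand-in for whatever analytics call the site actually uses.
      navigator.sendBeacon("/analytics", JSON.stringify({ name, ...payload }));
    }

    button.addEventListener("click", async () => {
      button.disabled = true;
      // The click itself is the engagement signal described above.
      logEvent("continue_reading_clicked", { articleId: body.dataset.articleId });
      // Only now fetch the rest of the article.
      const res = await fetch(`/api/article/${body.dataset.articleId}/rest`);
      body.insertAdjacentHTML("beforeend", await res.text());
      button.remove();
    });
    ```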

    • dual_sport_dork 🐧🗡️@lemmy.world · 10 months ago

      If you’re deep in researching something and the page takes over 3 seconds to load, you will leave the site, regardless of your internet speed or whether the fault is on your side

      As both a developer and an end user, this drives me batshit.

      Seemingly no one has figured out that if users are bouncing due to page load times, maybe the problem is that your page, which was supposed to be, say, a recipe for a bologna sandwich, doesn’t need to first load an embedded autoplaying video, an external jQuery library, a cookie notice, three time-delayed popovers, an embedded tweet, a sidebar that dynamically loads 20 irrelevant articles, and a 2600x4800 100vw headline image that scrolls up at half speed before the user can even get any of the content into the viewport. Just a thought. I don’t care what your dog-eared copy of Engagement For Dummies says. It is actually wrong.

      I have made the business I work for quite successful online by taking all of the alleged “best practices” that clearly annoy the shit out of everyone, and then just not doing them.

      • NotMyOldRedditName@lemmy.world · 10 months ago

        I hate with a passion how, when looking up recipes, you’ve got to go through like 5 pages of why they like it, a fluffed-up but useless “how it’s made,” and all sorts of other shit, and only then do you get the actual fucking ingredient list, cooking temperatures, and the actual cooking instructions.

        I HATE IT SO MUCH!

        • Case@lemmynsfw.com · 10 months ago

          Don’t forget the long-winded tales of how their distant relative they never met gave them the recipe from the “old country” or some shit.

          Dude, I just needed to see what temperature to set the oven to.

      • jaschen@lemmynsfw.com · 10 months ago

        It depends on the site. A recipe site is trying to get as many impressions as possible so it can either turn a profit or keep the lights on.

        If your company doesn’t rely on ads to stay afloat, the site experience is better.

        If you dislike the page, exit the page within 10ish seconds without clicking anything and you will hurt the page’s SEO ranking.

    • anothermember@lemmy.zip · 10 months ago

      Page load: The biggest, and I mean biggest, reason someone leaves a page is page load speed. If you’re deep in researching something and the page takes over 3 seconds to load, you will leave the site, regardless of your internet speed or whether the fault is on your side. Loading only a quarter of the page helps with this, along with other tricks like caching at the CDN and lazy loading.

      The thing that always bothers me about this is that I’ve been using the internet since 90s dial-up, and even 90s dial-up never had a “page load speed” problem when loading text-based articles. An extremely conservative estimate is that modern broadband speeds are 1000x what they were then, so “page load speed” is entirely about the design of the website, and it seems that the excuse is mostly “we want to spy on people.” Am I wrong? Otherwise, why not write an HTML page that would be just as compatible with Geocities as it would be now?

      • jas0n@lemmy.world · 10 months ago

        You can still write plain html websites, and they would be super fast! But that’s not how we do things damnit! I need to implement feature x. Do I spend all day rolling my own lean version? Fuck no. I download a 5-ton JavaScript library that already has that feature, and I fuck off the rest of the day.

        You are correct on one thing. The math does not add up at all.

        The root cause is the current meta of software development: bloat. Software is so ungodly bloated today because we’ve been taught, for as long as I can remember, that hardware is so fast nowadays that we don’t need to care about performance. Because of this mindset, many of the best practices we were taught work directly against performance (OOP was a mistake. Fight me).

        There might be overhead on the ad tracking bullshit… Sure. But, if developers cared about performance, that ad tracking can be fast, too ;]

        How long should it really take to render a webpage? That should be near instant. If modern games can render a full 3D landscape over 100 times a second, surely a wall of text and some images can be done in under 1 second, right?

        This is a problem in all software. For a simple example, I remember Microsoft Word from 20 years ago being quite snappy on the desktops of the time. By comparison, we are running supercomputers today; a cheap Android phone would blow that desktop out of the water. Yet, somehow, Word is a dog now…

      • jaschen@lemmynsfw.com · 10 months ago

        Some of my clients do not have the budget to give you free content without ads. Even a (usable) shared hosting server costs around 25 bucks a month. Add in dev time and design, and small mom-and-pop sites can’t afford to be ad-free.

        Only the big dogs do paywalls.

    • eatthecake@lemmy.world · 10 months ago

      That’s funny, I always thought “continue reading” was a paywall button leading to a subscription page, so I’d just back right out.

      • jaschen@lemmynsfw.com · 10 months ago

        Then the article isn’t strong enough and will be rewritten. The more relevant it is to your search, the higher the chance you will continue reading.

        • eatthecake@lemmy.world · 10 months ago

          I’m not sure you understand me. I assumed that the continue reading button would ask me to pay, and since I am not going to pay, I never continued reading.

          • jaschen@lemmynsfw.com · 10 months ago

            Ahhh, I think you might be an edge case. The users we tested this on all understood what would happen after clicking.

    • funkless_eck@sh.itjust.works · 10 months ago

      Also, a lot of websites are built on a CMS that has [Read More]… baked in. E.g., WordPress is designed around the concept of an excerpt for each page/post, as it was when it was built 20 years ago. Although, as others have pointed out, the time/data savings are minimal; that mattered when WordPress was invented, and it’s now a vestigial part of the system.

      • randombullet@programming.dev · 10 months ago

        Because many articles are served by a third-party CDN that’s more robust than the origin server.

        It might also just be part of how the site is coded.

      • jaschen@lemmynsfw.com · 10 months ago

        As I mentioned, small mom-and-pop shops can’t afford to give you free content without ads, so they prioritize the ad to get paid for the impression.

        Unfortunately, the content is not free to create and maintain.

    • Iamdanno@lemmynsfw.com · 10 months ago

      As a person who knows nothing about web development: can you not load the page in smaller chunks, so that the first screen or two of content loads fast and the rest loads while you’re looking at it? That way, to the user, it appears to load quickly enough to keep them from leaving.

      • AA5B@lemmy.world · 10 months ago

        It’s a bullshit excuse - a couple pages of text load in a second or two on even poor connections. They’re optimizing for ads and tracking.

        Let me correct my other comment here: I miss when a 9600-baud modem was fast, but holy crap has the internet gone downhill. Now get off my lawn.

      • dual_sport_dork 🐧🗡️@lemmy.world · 10 months ago

        You can, but you’d have to do it through scripting, which relies on your chosen methodology not breaking with browser updates and standards changes, on whether the user has scripting enabled to begin with, on whether their ad blockers or other plugins mess it up, etc. And then you can wind up just deferring the issue. Say the user wants to quickly skim your page to see whether it actually contains what they’re looking for or is just SEO bullshit, so they scroll down right after the first chunk loads and hit the point where the next chunk should load, only to find that it didn’t load instantly (because it probably won’t) and your content appears to cut off mid-page. They’ll assume your site is just broken, and you’ve never seen a user hit the back button so fast.

        So the answer is “yes, but,” and may not be worth the trouble.

        Clicking a “continue reading” button is not an ideal solution either, but at least the user will (or should) realize that they’ve performed an action that will load more content, as opposed to having it happen behind their backs in a manner they weren’t initially aware of.
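
        For reference, a minimal sketch of the scroll-triggered chunk loading being described, using an IntersectionObserver on a sentinel element. The endpoint and element IDs are hypothetical; the comment in the middle marks exactly the failure mode above.

        ```ts
        // Hypothetical sketch: load the next chunk when a sentinel scrolls into view.
        const sentinel = document.getElementById("load-more-sentinel")!;
        const container = document.getElementById("article-body")!;
        let nextChunk = 1;

        const observer = new IntersectionObserver(async (entries) => {
          if (!entries[0].isIntersecting) return;
          // The failure mode described above: a fast-scrolling user can reach the
          // sentinel before this fetch finishes, and the article looks cut off.
          const res = await fetch(`/article/chunks/${nextChunk++}`);
          if (!res.ok) {
            observer.disconnect(); // no more chunks, or the request failed
            return;
          }
          container.insertAdjacentHTML("beforeend", await res.text());
        });

        observer.observe(sentinel);
        ```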

        • Anamnesis@lemmy.world · 10 months ago

          Yeah, this shit annoys the hell out of me on certain websites where I’m trying to Ctrl-F for information. The whole page hasn’t loaded until I scroll down, so my search ends up being worthless.

      • Ross_audio@lemmy.world · 10 months ago

        You lose backwards compatibility with older web browsers if you do that.

        It also doesn’t help reader apps or plugins, SEO, or various other things to have the site stream the text in instead of just loading it.

        Basically, it moves you from a standard thing everything understands to a non-standard thing that might break. It’s just not worth it.

      • jaschen@lemmynsfw.com · 10 months ago

        What you’re talking about is called lazy loading: the page loads text and CSS first, then images after.

        Most modern sites now do this, along with not loading the rest of the content at all until you hit the Continue button. That not only reduces your browser’s load, it reduces server load as well.

        There are many other reasons to have the Continue button, but the positives outweigh the negatives. It’s not considered a dark pattern, and it helps the content team improve their content.
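
        For the image half of this, browsers now support lazy loading natively; a minimal sketch (the data-defer attribute is an illustrative convention, not a standard):

        ```ts
        // Hypothetical: defer below-the-fold images via the browser's native lazy loading.
        document.querySelectorAll<HTMLImageElement>("img[data-defer]").forEach((img) => {
          img.loading = "lazy";              // fetched only as the image nears the viewport
          img.src = img.dataset.defer ?? ""; // real URL stored in a data attribute
        });
        ```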

    • CluckN@lemmy.world · 10 months ago

      Interesting, is it tough to keep up with Google’s SEO? I’ve seen some weird blogs ranking extremely high for basic searches.

      • jaschen@lemmynsfw.com · 10 months ago

        I started my career in SEO and moved into web management because it was just too tiring keeping up with Google. The last update I can remember was called “Panda”; that was back when they still named their updates.

        My current SEO strategy is super simple: have the content you’re writing relate as closely as possible to the user’s intent. Give the user what they’re looking for FAST, then cross-link and cross-sell after. You will have a good page.

      • squiblet@kbin.social · 10 months ago

        They’re constantly tweaking it, partly to stay ahead of the blogspam farms that make thousands of low-quality or outright-bullshit pages just trying to get clicks for ads.

      • squiblet@kbin.social · 10 months ago

        Google offers an analytics package that a huge number of sites embed. Many other companies, like Facebook, have similar software available. People mostly install it to track the performance of Google-published ads, but it gathers a LOT more data than that, and you don’t need to use their ad system to put it on your site.

        Anyway, it runs JavaScript that gathers information about everything a visitor does on the site and sends it to Google. You can “opt out” by using a browser extension like NoScript. I assume ad blockers could work too.

        For people developing or running a site, it really does give you a ton of useful information: where your visitors are from, what pages they viewed, how they got to your site (search terms, ads, referrers), how long they spent on your site, even a “heat map” showing which parts of the page people hovered on with their mouse pointer. The tradeoff is that Google gets all of this information too.
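
        As a rough illustration of the mechanism (not any particular site’s setup), pages using Google’s gtag.js report interactions as events; the event name and parameter here are hypothetical:

        ```ts
        // Assumes the standard gtag.js loader snippet has already run on the page.
        declare function gtag(...args: unknown[]): void;

        // Hypothetical engagement event; interactions like this are what get reported.
        document.getElementById("continue-reading")?.addEventListener("click", () => {
          gtag("event", "continue_reading", { article_id: "example-123" }); // illustrative
        });
        ```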

    • gramie@lemmy.ca · 10 months ago

      page load

      It would be fine if they only loaded a partial page so that it rendered in my browser quicker.

      However, what usually happens is that the entire page loads, and then an overlay pops up to get me to register, or pay, or whatever.

      Being a web developer, it’s not hard for me to inspect the page and remove the overlay so I can read everything, but it is an annoyance.

  • ArgentRaven@lemmy.world · 10 months ago

    It’s twofold:

    1. It’s good proof of “user interaction with the site” to sell to advertisers.

    2. They can use it to load more ads, or refresh the current ones, once more text loads, and by then you’re already bought into the story, so you’re likely to keep going.

    I suspect a third reason is to add other news stories at the end in case the current one didn’t grab your attention, but that doesn’t seem as consistent among the sites I’ve seen do this. I run ad blockers, though, so I don’t really see the sites the way they expect me to.

      • fine_sandy_bottom@discuss.tchncs.de · 10 months ago

        Nah, that’s not it. The text content is an infinitesimal portion of a modern web page.

        Many web pages are over 1 MB; that’s a million letters, if you will.

        • Vlyn@lemmy.zip · 10 months ago

          Articles usually have images and possibly embedded videos. So it’s not just text.

          Even so, a decent webserver wouldn’t really care.

          Maybe it loads faster for mobile users though if you only load text and a single image at first.

          • fine_sandy_bottom@discuss.tchncs.de · 10 months ago

            I’m not sure what you’re getting at.

            The comment I replied to said that maybe the “read more” button is an effort to conserve bandwidth by only sending half the text.

            I said that the text is such a tiny portion of the bandwidth required to transmit a web page that it wouldn’t make sense to try conserving it by only sending half.

            You’re absolutely correct in that only sending images on the visible part of the page is a common way to conserve bandwidth.

      • pathief@lemmy.world · 10 months ago

        The cost of making a second request for the rest of the article is higher than just returning the full article. The only use case where this makes sense is when the news is behind a paywall and you just want to show a teaser to anonymous readers.

        • Zagorath@aussie.zone · 10 months ago

          The only use case where this makes sense is when the news is behind a paywall

          It can be particularly good in soft-paywall situations, where you want to give people a certain number of clicks per month before they have to start paying.

          I don’t think I’ve ever actually seen these “keep reading” buttons used in that way, though.
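
          Still, as a toy sketch of how such per-month metering can work (all names are made up, and real publishers enforce this server-side, since anything client-side is trivially reset):

          ```ts
          // Toy client-side meter for a soft paywall. Real implementations track this
          // server-side against an account or fingerprint; localStorage is easily cleared.
          const FREE_ARTICLES_PER_MONTH = 5; // illustrative limit
          const now = new Date();
          const monthKey = `reads-${now.getFullYear()}-${now.getMonth()}`;

          const readsSoFar = Number(localStorage.getItem(monthKey) ?? "0");
          if (readsSoFar >= FREE_ARTICLES_PER_MONTH) {
            // Over the limit: show the subscription prompt instead of the article.
            document.getElementById("paywall-overlay")?.classList.remove("hidden");
          } else {
            localStorage.setItem(monthKey, String(readsSoFar + 1));
          }
          ```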

  • FaceDeer@kbin.social · 10 months ago

    My guess is that this gives them data they can analyze on how many people actually read that far down the page.

        • FuglyDuck@lemmy.world · 10 months ago

          That’s exactly how it works.

          Increased traffic to your website increases the value of ad space on that website, and advertisers pay you more for it; the “read more” button is one tool to demonstrate that MSN, or whoever, actually has traffic.

          It’s far from the only tool, but it is one, and it’s part of the analytics they’re running. It also shows some amount of engagement: that people are actually reading the article rather than clicking on it and forgetting about it.

          • 1rre@discuss.tchncs.de · 10 months ago

            I mean, the best way to increase the value of your ad space is to have a small but visible amount of it and to produce content good enough that advertisers come to you, rather than the other way around.

            The issue is that producing good content takes effort, and it’s easier to just paraphrase existing content or AI-generate new content, which is what results in a “read more” button. (That’s unrelated to the “enter your email to read more” option, which is 100% for advertising as a replacement for third-party cookies. It lets users see and decide exactly which websites to share their identity with, as an active decision, rather than shadier behind-the-scenes stuff like cookies or fingerprinting that track you without you even knowing. Expect to see a lot more of it as those go away.)

            • FuglyDuck@lemmy.world · 10 months ago

              And why does good content cause advertisers to come to you?

              Traffic. The more people who come through your site, the more valuable the ad space, and the more they’re willing to pay.

              Good content in niche areas will also increase value, yes, but there’s a reason websites pay for SEO services…

              • 1rre@discuss.tchncs.de · 10 months ago

                Yes, but not as exclusively as you might think. There’s an increasing number of manually vetted “premium” sites (for better or worse, since this reduces SEO spam while also making it harder for good but niche content to break through) that provide actually good content. Irritated people hunting for one sentence in a multi-page article aren’t going to look kindly on ads, whereas engaged people reading good content will.

  • Spzi@lemm.ee · 10 months ago

    Just a guess: to prevent bots from scraping the full content?

    • dual_sport_dork 🐧🗡️@lemmy.world · 10 months ago

      Doubt it. My web analytics indicate that bots click on every single element on the page, whether it makes sense or not.

      For this reason it’s a good idea not to let your site generate any kind of circular, self-referential loop that can be reached via navigation or clicking on things, because poorly coded bots won’t realize they’re driving themselves around in circles and will proceed to bombard your server with zillions of requests per second for the same thing over and over again.

      Likewise, if you have any user-initiated action that can generate an arbitrary result set, like adding an item or set of items to a quote or cart, it is imperative that you set an upper bound on the result length or request size (server side!), and ideally configure your server to temp-ban a client that makes too many requests that are too large in too short a time span. If you don’t, bad bots absolutely will eventually attempt to, e.g., create a shopping cart with 99999999999999999 items in it. Or a search query with 4.7 gigabytes worth of keywords. Or whatever. Either because they’re coded by morons or, worse, because they’re coded by someone who wants to see if they can break your site by doing stuff like that.
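
      A minimal server-side sketch of those caps in Express; the limits, route, and in-memory strike list are all illustrative, and many sites enforce this at the reverse proxy or CDN instead:

      ```ts
      // Minimal sketch of server-side caps: a request-size limit, a sanity check on
      // quantities, and a crude temp-ban for repeat offenders. All limits and routes
      // are illustrative, not a recommendation of specific values.
      import express from "express";

      const app = express();
      app.use(express.json({ limit: "100kb" })); // reject oversized request bodies outright

      const MAX_CART_QTY = 1000;                 // illustrative upper bound
      const strikes = new Map<string, number>(); // client ip -> bad-request count

      app.post("/cart/add", (req, res) => {
        const ip = req.ip ?? "unknown";
        const qty = Number(req.body?.quantity);
        if (!Number.isInteger(qty) || qty < 1 || qty > MAX_CART_QTY) {
          const count = (strikes.get(ip) ?? 0) + 1;
          strikes.set(ip, count);
          if (count > 5) return res.status(429).send("Too many bad requests; try later.");
          return res.status(400).send("Invalid quantity.");
        }
        // ...add to the cart normally...
        res.sendStatus(200);
      });

      app.listen(3000);
      ```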

      • petrol_sniff_king@lemmy.blahaj.zone · 10 months ago

        it’s a good idea not to let your site generate any kind of circular, self-referential loop that can be reached via navigation or clicking on things

        Don’t nearly all sites have a logo at the top that takes you back to the homepage? I’m not really following.

        My intuition is that the only safe solution is to rate-limit requests; a poorly coded bot could just be a while loop hitting the same URL ad infinitum.

        [e] Unless there’s something to this I’m not thinking about.

  • redcalcium@lemmy.institute · 10 months ago

    Apparently it can boost engagement:

    At the Times, which got 60 percent of its June visitors from mobile, the “show full article button” has resulted in “moderate increase” in the time readers spend, according to Paul Werdel, senior product manager on mobile.

    Quartz, which also introduced its own “read full story” button alongside its design refresh in June, has used the button to boost the performance of its mobile Engage ads, which appear directly below the button. The Huffington Post uses a similar approach, presenting readers with a 300 x 250 banner ad below its own “read more” button. Huffington Post VP of Engineering Sam Napolitano said that preliminary data on the feature has been “very positive” since its addition.

    https://digiday.com/media/publishers-mobile-truncated-page/

  • Phil K@lemmy.world · 10 months ago

    1. Some people prefer pages to scrolling (it’s amazing how strong opinions are on either side)
    2. Advertisements are charged per impression, so each page counts as a new impression
    3. Be grateful websites no longer auto-scroll web pages
    4. Some things lend themselves to page-by-page, e.g. very long articles (this is why books replaced scrolls)
    5. Be grateful that websites stopped animating page turns, etc.
    6. Sometimes web developers don’t care and just use a bought-in package
    • conciselyverbose@kbin.social · 10 months ago

      I definitely scroll for web pages, but my ebook reader apps all give the option of scrolling, and I can’t stand anything but pages for a book, so I get it.

      (I do do most of my reading on eInk, so obviously scrolling wouldn’t work there. But I don’t read that way exclusively; I read some on my phone and iPad as well, and scrolling books feels awful.)

    • dual_sport_dork 🐧🗡️@lemmy.world · 10 months ago

      it’s amazing how strong opinions are on either side

      Regardless of user preference, on the web it’s a fool’s errand to try to force your content into a page-by-page format. Designing a paged presentation that’s guaranteed to work on every device and screen size is so close to impossible that it’s not worth bothering, and that’s before you account for whether the user prefers landscape or portrait, their screen’s aspect ratio, their browser’s zoom level, or even how their browser implements zoom and content rescaling. So 9 times out of 10 you’ll wind up with content broken into pages that the user still has to scroll around in to see everything anyway. Or where everything is illegibly microscopic. Or both! Especially on a mobile device, which is something like 92% of all web users these days. Every time you fail, you will annoy the shit out of your user base, and you will fail more often than you succeed.

      No, the sole driving factor behind sites breaking articles into “pages” is so they can load more ads on each page change.

      (This does not apply to non-website mediums, obviously. IDGAF how you digest your e-books, manga, or whatever.)

  • JimmyBigSausage@lemm.ee · 10 months ago

    Because they want you to see as many ads as possible, however obnoxiously; they don’t care if you read the article, only that you view ads. This is the new shitty web. MSN, Newsweek, and Yahoo are the scummy kings.

  • oktoberpaard@feddit.nl · 10 months ago

    Maybe to make the article seem shorter, so you’re more inclined to keep reading. Once you’re halfway through, you’re more likely to want to read the rest. Both halves are probably filled with ads, so the longer you stick around, the better.

  • KptnAutismus@lemmy.world · 10 months ago

    While data collection and advertising are a big part of it, they’re probably also trying to “save” on bandwidth, since you might not read the entire article.