Web Pages

  sunnystaines 10:12 20 Mar 12

When broadband first came in, web pages opened very fast, a great change from the 56k modem.

But as time has moved on, broadband and browsers have got faster, yet so many web pages have got slower to open. Is it too many ads, tracking cookies etc, or too many users online at the same time choking the net? When I was with Bulldog it was a max of 25 users per piece of equipment at the exchange, but an engineer told me it is now nearer 50-100 users.

Anyone with any views?

  lotvic 11:02 20 Mar 12

Yes, all of the above. I think of it like the road system. First they changed some A roads to dual carriageways, then when some of those couldn't cope and there were long delays, motorways were built, etc. etc. More cars capable of faster speeds, but if the roads can't handle the traffic everything slows down. Ah well, the British are supposed to be used to queuing up and waiting for everything... ;-)

  interzone55 11:12 20 Mar 12

I think the problem is down to how web pages are formed these days.

It's no longer a simple HTML page with images, now the pages are generated on the fly from several different sources.

If you look at this page the basic background is one part of the page, loaded from one server.

The post is from a database, likely to be on another server.

The ads between the thread and the reply box are from Google's servers.

The links down the side are from yet another server.

The page also has active links to Google+, Twitter, Tumblr, Digg & Reddit.

Each server will deliver data much faster than a server from the initial ADSL era, but assembling the page takes a little while.
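The breakdown above can be sketched with a few lines of Python: scan a page's markup and list the distinct servers it pulls resources from. The markup and hostnames here are made-up stand-ins for the separate sources described above (layout, ads, social buttons), not the real servers behind this site.

```python
from html.parser import HTMLParser
from urllib.parse import urlparse

# Hypothetical page markup: each resource comes from a different server,
# roughly as described in the post above.
PAGE = """
<html>
  <link rel="stylesheet" href="https://static.example-forum.com/style.css">
  <img src="https://images.example-forum.com/logo.png">
  <script src="https://pagead2.googlesyndication.com/ads.js"></script>
  <script src="https://platform.twitter.com/widgets.js"></script>
</html>
"""

class OriginCollector(HTMLParser):
    """Collect the distinct hosts a page loads resources from."""
    def __init__(self):
        super().__init__()
        self.origins = set()

    def handle_starttag(self, tag, attrs):
        for name, value in attrs:
            if name in ("src", "href") and value.startswith("http"):
                self.origins.add(urlparse(value).hostname)

collector = OriginCollector()
collector.feed(PAGE)
print(sorted(collector.origins))  # one page, four separate servers
```

Each of those hosts means a separate DNS lookup and connection before the browser can finish assembling the page, which is where the delay creeps in.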

  sunnystaines 12:56 20 Mar 12


Thanks, I never knew pages were that complex.

  wiz-king 13:05 20 Mar 12

Just for fun - if you have time to spare - try using this site on a dial-up modem, you reply to a post and find 5 people have beaten you to it.

  Aitchbee 13:20 20 Mar 12

wiz-king - I have mobile broadband and dial-up modem internet connections, operating on two computers side by side.

My money is on...'dial-up'....50p e/w

  interzone55 14:28 20 Mar 12

Just thought of something else that slows web pages down

Your security software.

Many packages now scan pages as they're loaded, so all the sections I described above are scanned, then every single link on the page is also scanned, just to check that no nasties lie in wait.

I can't even begin to count the number of links on this page
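As a back-of-the-envelope illustration of why that scanning hurts, the figures below are purely assumed (the per-link cost will vary wildly between security packages), but they show how even a small per-link delay adds up on a link-heavy page:

```python
# Made-up figures, just to show how per-link scanning time accumulates.
links_on_page = 200      # assumed link count for a busy forum page
ms_per_link_scan = 5     # assumed scanner cost per URL checked

total_ms = links_on_page * ms_per_link_scan
print(f"{total_ms} ms added by link scanning alone")  # 1000 ms
```

A full second of delay from the security suite alone, before the network or the servers have contributed anything.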

  Batch 18:07 20 Mar 12

Following on from alan14's first post, I think you will find that web page content is much richer than it used to be (in so many ways). In fact, if you were to go back to web pages as they were all those years ago, I suspect they would seem quite crude by today's standards. Here's an example of an unsophisticated website that should be more like the old days: click here (it may take a few secs to fully load initially as it has a Java-based menu system, but if you don't have Java installed it should still work).

Also, JavaScript (not Java, which is something different) is used a lot more in web pages these days (rather than vanilla HTML). JavaScript is typically heavier on the CPU, particularly with older browsers (e.g. IE8 or earlier, which have a very poor JavaScript engine).

  john bunyan 18:16 20 Mar 12

sunnystaines - Not sure if it speeds things up much, but since I installed DNT+ I have had no tracking cookies in my SAS scans. On this site at the moment it is blocking 11 trackers: 3 "Social Buttons", 2 Ad Networks and 6 Companies. In a few weeks it has blocked over 17,000 trackers. Might be worth a try.


  Forum Editor 18:33 20 Mar 12

"When I was with Bulldog it was a max of 25 users per piece of equipment at the exchange, but an engineer told me it is now nearer 50-100 users."

You're referring to what's called the contention ratio - the maximum number of users who may theoretically be contending for a share of the available bandwidth on the backhaul link from your local exchange to the BT network. Domestic ADSL users are normally on a 50:1 contention ratio - the 20:1 connections are for business customers.

It gets very complicated, but in essence the greater the number of people on the same backhaul link, the less bandwidth per person. As the amount of bandwidth available to you decreases the pages you request will load more slowly. At peak times you may be contending with up to 49 other people, although in practice that rarely happens.
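The arithmetic behind that can be sketched in a few lines. This is a deliberately simplified model with assumed figures (an 8 Mbit/s sync speed and backhaul provisioned at roughly one full-rate line per 50 users), not real exchange data:

```python
# Simplified contention model - assumed figures, not real exchange data.
line_rate_mbps = 8.0     # assumed ADSL sync speed
contention_ratio = 50    # 50:1 domestic ratio, per the post above

# At 50:1, up to 50 users share backhaul capacity roughly equal
# to one full-rate line, so each active user's worst-case share is:
for active_users in (1, 10, 50):
    share = line_rate_mbps / active_users
    print(f"{active_users:2d} active users -> {share:.2f} Mbit/s each")
```

So with only one user active you get the full line rate, but in the theoretical worst case of all 50 contending at once each would see a small fraction of it, which is why pages crawl at peak times.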

Bandwidth is just one of the ingredients in the mix - the distance between your house and the exchange plays its part, as does the quality of the copper line along which all that data has to travel. Complex web sites may load more slowly, for the reasons already mentioned by other contributors.

  sunnystaines 19:03 20 Mar 12

John, thanks for the link. FE, some good reading, thanks.

This thread is now locked and can not be replied to.
