Website page weight is a notional value that characterizes the “importance” of a web page; it is a combination of static and dynamic page weight. There is also page weight in the literal sense, measured in kilobytes, which you can find out from the amount of code on the page. That kind of weight is not directly related to SEO, although page size in kilobytes can, of course, affect the site’s loading speed, but that is not the point here.
Let's figure out what page weight is, what forms it takes, and how static page weight differs from dynamic page weight. This will bring us closer to understanding internal linking of site pages.
What is static page weight
Static page weight is determined by summing the weight that the pages of a site pass to each other. Let's assume that every page on the site starts with the same priority, equal to one. This priority is called weight (by analogy with authority among people: the more authoritative a person is, the more weight their opinion carries). Pages pass static weight to each other through links. To understand page weight, keep in mind that the more links there are on a page, the less weight each individual link passes.
For example:
— one link on a page passes a weight of 1;
— two links on a page pass a weight of 0.5 each;
— three links on a page divide the weight by three;
and so on.
When determining the weight of a site page, you also need to take into account more than the initial static weight of one. A page passes on all of its static weight, including the weight that other pages have transferred to it. In other words, a page does not pass on 1, as at the start, but its current weight. This is where the question arises: how do you find out the weight of a website page?
Determining the weight of website pages is a relatively difficult task, because all the pages of a site influence each other, transferring their weight over and over again. To check the weight of a website page online, you can use special services and programs for determining page weight, of which there are plenty.
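To make this concrete, here is a minimal sketch of the iterative weight-passing model in TypeScript. The three-page link graph, the fixed number of iterations and the absence of any damping factor are illustrative assumptions, not the algorithm of any real search engine or weight-checking service.

    // A minimal sketch of the iterative weight-passing model described above.
    const links: Record<string, string[]> = {
      home:     ["about", "services"],
      about:    ["home"],
      services: ["home", "about"],
    };

    let weight: Record<string, number> = {};
    for (const page of Object.keys(links)) weight[page] = 1;   // every page starts with weight 1

    for (let i = 0; i < 50; i++) {                             // repeat until the values settle
      const next: Record<string, number> = {};
      for (const page of Object.keys(links)) next[page] = 0;
      for (const [page, outgoing] of Object.entries(links)) {
        const share = weight[page] / outgoing.length;          // each link passes an equal share
        for (const target of outgoing) next[target] += share;
      }
      weight = next;
    }

    console.log(weight);   // the page everything links to ("home") accumulates the most weight

After enough passes the values settle, and the page that all other pages link to ends up with the largest share of the total weight.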
What was done
High-frequency queries were selected from the semantic core (the list of queries for our website). Based on them, an anchor list was compiled (a list of queries used as the text of internal links). Links with these anchors were placed on the pages with the highest weight (for example, with a Link Score of 100) and pointed to the promoted pages (for example, the home page).
Let's take the example of a travel agency website. Suppose the privacy policy page receives the most weight on the site. We select the high-frequency anchor “travel agency” and turn it into a link to the main page. That is, we find or add the text:
Our travel agency invites you to read our privacy policy.
We format the anchor as a link in this way:
<a href="/">your anchor</a>
Static weight of the main page of the site
It is logical to assume that the largest static weight belongs to the site’s main page. The main page accumulates the most link juice, usually because links from every page of the site point back to it. In addition to the main page, the top-level sections of the site also receive significant weight. But static weight is not the only reason the main page and the main sections have the highest priority; there is a second reason, and that is dynamic page weight.
How to set page size in HTML?
In the browser you can do this: press CTRL and the minus key. After that, the page will shrink while keeping the document width.
What is dynamic page weight
Dynamic page weight is changeable, or moving, page weight. It is the result of user activity on the site, namely clicks on links. The more often users click a link, the more authority the target page gains. Similar values are used when conducting an SEO audit of a website and when analyzing behavioral ranking factors. It is not really possible to measure dynamic page weight precisely.
Even search engines analyze such complex factors differently. But a certain reflection of a page's dynamic weight, in the form of “popular pages” reports, can be obtained from statistics services such as Yandex Metrica or Google Analytics. In addition to internal dynamic link weight, there is also external link weight: the so-called citation index (CI) and thematic citation index (TIC). These values can be found fairly accurately through the webmaster services of Google or Yandex.
How much does the Internet weigh? We tell you how data grows in volume
Have there been studies on this topic?
There are several ways to calculate the mass of the Internet; scientists differ on which method to use.
The first method for calculating the mass of the Network, proposed by Russell Seitz, uses data on the number of servers supporting its operation (from 75 to 100 million according to various sources), their average power consumption (from 350 to 550 W), the average voltage in their logic circuits (3 V) and their clock frequency (1 GHz).
A current of 1 ampere corresponds to a flow of on the order of 10¹⁸ electrons per second. A direct calculation shows that the Internet as a whole is powered by the movement of just over 50 grams of electrons.
Russell Seitz, Principal Investigator
The staff of the American popular science magazine Discovery held a different point of view.
Here is how they reasoned: the long chain of ones and zeros in which a transmitted document is encoded is split into packets of several tens to several hundred bytes for the journey across the network. Each packet is accompanied by the address it should be sent to and a sequence number, which allows the packets to be correctly reassembled into a single whole at the point of receipt. Along the way, these packets pass through many computers; in each one they are briefly held in memory and analyzed, then their further path is determined and they are sent onward.
Both calculations are based on the electron rest mass (9.1 × 10⁻³¹ kg). According to the same Russell Seitz, to “feed” this flow of 50 grams of electrons, about 50 million horsepower must be applied.
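As a rough sanity check, the power figures quoted above can be tied to the horsepower claim with a couple of lines of TypeScript; the specific values chosen inside the quoted ranges are mine, not Seitz's.

    // A rough consistency check using only the figures quoted in the text.
    const servers = 100e6;                       // upper end of the quoted 75-100 million range
    const wattsPerServer = 400;                  // within the quoted 350-550 W range
    const totalWatts = servers * wattsPerServer; // ≈ 4e10 W
    const horsepower = totalWatts / 745.7;       // 1 hp ≈ 745.7 W
    console.log((horsepower / 1e6).toFixed(0) + " million hp");   // ≈ 54 million, close to the article's figure

    const electronMassKg = 9.1e-31;
    console.log((0.05 / electronMassKg).toExponential(1) + " electrons in 50 grams");  // ≈ 5.5e28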
How to calculate mass?
If this short article were sent by email, it would take about 25 KB (text only, no pictures). There are 1,024 bytes in a kilobyte and 8 bits in a byte, so the volume of the article is about 205,000 bits. We can assume that half of them are ones and half are zeros. That gives roughly 102,500 ones, and each one is represented by about 40,000 electrons. In total, approximately 4 billion electrons are used to store this article. The mass of an electron is 9.11 × 10⁻²⁸ grams; multiply, and you get the mass of this text in the computer's memory.
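The same arithmetic, written out in TypeScript using only the figures from the paragraph above (25 KB of text, half ones, 40,000 electrons per stored one):

    const bits = 25 * 1024 * 8;                 // ≈ 205,000 bits
    const ones = bits / 2;                      // ≈ 102,500 ones
    const electrons = ones * 40_000;            // ≈ 4.1 billion electrons
    const electronMassGrams = 9.11e-28;
    console.log((electrons * electronMassGrams).toExponential(2) + " g");  // ≈ 3.7e-18 grams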
But it's only one email. According to data for 2008, all information sent per day weighed 0.0057 milligrams. And another third of this weight must be added if we want to take into account not only the exchange of files between users, but also the information requested from sites.
How has activity changed today?
According to Internet Live Stats, every second more than 50,000 searches are made on Google, 120,000 YouTube videos are watched, and nearly 2.5 million emails are sent. Yes, it’s very impressive, but still these data do not allow us to fully imagine the size of the Internet.
In September 2014, the total number of sites exceeded a billion, and today there are approximately 1.018 billion. And that is without counting the so-called “deep web”, the collection of sites not indexed by search engines: its content ranges from the completely harmless (for example, online databases) to the completely illegal (for example, black-market marketplaces accessible only through Tor). Although the Tor browser is used not only by lawbreakers, but also by users who simply want anonymity.
Please note that the above estimate of the number of websites is approximate. Sites come and go, and the size of the deep and dark webs is almost impossible to determine. Therefore, it is very difficult to even approximately estimate the size of the network using this criterion. But one thing is certain - the network is constantly growing.
One way to assess the information circulating on the Internet is to measure traffic. According to Cisco, by the end of 2016 about 1.1 zettabytes of data were expected to be transferred worldwide, and by 2019 the volume of traffic had roughly doubled, reaching 2 zettabytes per year.
But how can you even begin to imagine 10²¹ bytes? One zettabyte is equivalent to 36,000 years of HDTV video. And it would take five years to watch the video that is transmitted around the world every second.
How is this information presented on physical media?
Despite the rise of the digital age, for many of us, bits and bytes remain somewhat abstract concepts. Previously, memory was measured in megabytes, now in gigabytes. What if we try to imagine the size of the Internet in some tangible form?
In 2015, two scientists proposed using real A4 paper pages for the assessment. Taking data from the WorldWideWebSize service as a basis, they counted each web page as equivalent to 30 paper pages. That gives 4.54 × 10⁹ × 30 ≈ 1.36 × 10¹¹ A4 pages.
But from the point of view of human perception this is no better than plain bytes, so the paper figure was tied to the Amazon jungle. According to the authors' calculations, producing that amount of paper would require 8,011,765 trees, which is equivalent to 113 km² of jungle, or 0.002% of the total area of the Amazon.
Later, though, the Washington Post suggested that 30 pages was too many and that one web page is more fairly equated to 6.5 A4 pages. In that case the entire Internet could be printed on 305.5 billion sheets of paper.
But all this is true only for text information, which does not occupy the largest share of the total data volume. According to Cisco, in 2015, video alone accounted for 27,500 PB per month, and combined website, email and “data” traffic accounted for 7,700 PB.
File transfer accounted for slightly less—6,100 PB. In case anyone has forgotten, a petabyte is equal to a million gigabytes. So the Amazon jungle will not allow us to imagine the volumes of data on the Internet.
A 2011 study suggested a visualization using optical discs. According to its authors, in 2007, 94% of all information was already in digital form: 277.3 optimally compressed exabytes (that is, data compressed with the most efficient algorithms available in 2007).
If we record all this wealth on DVD (4.7 GB each), we will get 59,000,000,000 discs. If we assume the thickness of one disk to be 1.2 mm, then this stack would be 70,800 km high.
For comparison, the length of the equator is 40,000 km, and the total length of the Russian state border is 61,000 km. Moreover, this is the amount of data as of 2007. Now let’s try in the same way to estimate the total volume of traffic that is predicted for this year - 1.1 zettabytes. We get a stack of DVDs 280,850 km high. Here it’s time to move on to cosmic comparisons: the average distance to the Moon is 385,000 km.
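The stack-of-DVDs estimate is easy to reproduce; the sketch below uses only the capacity and thickness figures quoted above.

    const dvdCapacityGb = 4.7;
    const discThicknessMm = 1.2;

    function stackHeightKm(totalGb: number): number {
      const discs = totalGb / dvdCapacityGb;
      return (discs * discThicknessMm) / 1e6;   // millimetres to kilometres
    }

    console.log(stackHeightKm(277.3e9).toFixed(0) + " km");  // 2007 data: ≈ 70,800 km
    console.log(stackHeightKm(1.1e12).toFixed(0) + " km");   // 1.1 zettabytes: ≈ 280,850 km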
How the volume of information will change: scientists' forecasts
University of Portsmouth researcher Melvin Vopson estimates that, at the current rate of growth, digital information could account for half the mass of the Earth by 2245. The scientist published his article in the journal AIP Advances.
Vopson builds on Einstein's principle of mass-energy equivalence, as well as the work of Rolf Landauer, who applied the laws of thermodynamics to information, and the research of Claude Shannon, who introduced the bit.
According to Vopson, in about 130 years the energy required to sustain the creation of digital information will equal all the energy currently produced on Earth, and by 2245 half the Earth's mass will have been converted into "digital information mass."
He reaches this conclusion because humanity uses resources such as coal, oil, natural gas, copper, silicon and aluminum to build and run huge computer farms and to process digital information. In his view, this leads to a gradual redistribution of earthly matter from physical atoms to digital information, which he treats as a fifth state of matter alongside solid, liquid, gas and plasma.
Eventually, according to the author of the paper, we will reach a state where the number of bits created by humans exceeds the number of atoms on Earth. By Vopson's calculations, this will happen within about 150 years, given the current growth in the amount of information of 50% per year.
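As a very rough back-of-the-envelope check of that 150-year figure, the sketch below assumes about 1.33 × 10⁵⁰ atoms on Earth (a commonly cited estimate) and about 10²³ bits of digital information produced per year today; both starting values are my assumptions, not Vopson's exact inputs.

    const atomsOnEarth = 1.33e50;               // assumed estimate of the number of atoms on Earth
    const bitsPerYearNow = 1e23;                // assumed current annual output of digital bits
    const growth = 1.5;                         // 50% more bits every year

    const years = Math.log(atomsOnEarth / bitsPerYearNow) / Math.log(growth);
    console.log(years.toFixed(0) + " years");   // ≈ 154, in the same ballpark as the article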
The growth of digital information seems unstoppable. According to IBM and other big data companies, 90% of the world's information today was created in the last 10 years alone. In some ways, the current COVID-19 pandemic has accelerated this process as it has enabled us to produce and consume more digital content than ever before.
Melvin Vopson, University of Portsmouth researcher
Landing Page weight
It is not hard to guess that one-page landing sites fall outside this picture. They have no other pages to pass weight to, so they have neither a static nor a dynamic weight distribution to speak of: all of the priority is concentrated on the single landing page. After all, the point of creating a landing page is to optimize user actions and shorten the path to the goal. That is why a concept like “Landing Page SEO” looks strange for one-page sites. Moreover, the behavioral factor is becoming more and more important in ranking, alongside content and links.
Why is this necessary?
Websites, too, need to lose weight for the summer. The larger a page, the slower it loads, the longer the user waits for the content, and the worse the site looks in the eyes of search engines. It is worth finding out the size of a page, and of the whole site, for two reasons: to see how much space is left on your hosting and to estimate how quickly the site loads in the browser. The second is especially important: page loading speed has been factored into search results since 2010, and if a site is heavy and slow, you should not count on reaching the top.
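If you want a quick number rather than an online service, here is a minimal sketch for checking a page's HTML weight in kilobytes. It assumes an environment with the Fetch API (a modern browser console or Node 18+), the URL is just a placeholder, and it counts only the markup, not images, CSS or scripts.

    async function pageWeightKb(url: string): Promise<number> {
      const response = await fetch(url);                      // download the raw HTML
      const html = await response.text();
      const bytes = new TextEncoder().encode(html).length;    // size of the markup in bytes
      return bytes / 1024;                                    // convert to kilobytes
    }

    pageWeightKb("https://example.com/").then(kb =>
      console.log(`HTML weight: ${kb.toFixed(1)} KB (images, CSS and scripts not included)`)
    );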
What page size is optimal for search engines?
For search engines, “sheets” of hundreds of kilobytes and fake pseudo-pages whose only content is a product photo and a price are equally bad. Yet think of the tons of code and piles of links spent on them! This is the fault of unoptimized online stores, which have been dropping out of search results in droves since the beginning of 2010. The optimal amount of indexed text on a page is from 2 to 10 kilobytes, and the total weight of the page is best kept as low as possible.
There is an SEO legend that service HTML code is not taken into account by search engines. Yeah, right... Then it is a small matter: make a page with half a megabyte of code and three lines of content. A suicide mission! Let us spell it out for sensible people: all other things being equal, the page that spends less service code per unit of useful information will rank higher in search. This is one of the important ranking factors, which, given the fierce competition on the Internet, only a Rockefeller grandson can afford to ignore. And even then only in RuNet: in the .com zone, the first three pages of Google for commercial topics are occupied by exactly such multi-million-dollar corporations.
The content of a page should be as unique as possible, even within the site itself. A paragraph of text repeated on every page brings no benefit, only direct harm, and this is the second warning to young webmasters. Grandfather Stalin would have sent a webmaster to the camps for this; Internet search engines are more humane: they simply throw such duplicates out of the search results, and that's it. So when optimizing your site, avoid identical pieces of text content at all costs. If you cannot avoid them, rework them with synonyms; if you do not know how, read this site, or at least do not use “fuzzy duplicates” (there is such a term) on your pages.
The main reasons for the appearance of very large pages
- SVG images (“scalable vector graphics”) are built into the page code - this is a type of graphics that is created using a mathematical description of the geometric elements that form all the details of the future image. That is, in this format it is not the picture itself that is stored, but the instructions for constructing it.
- Debugging information left embedded in the site's code.
- Malicious code that slows down loading. This can be the remnants of removed viruses or code added after the site was hacked.
- JavaScript inserted directly into the HTML code of the page instead of being placed in a separate file.
- Redundant CSS styles added to the HTML for each element after content was pasted in from another document, taking up a lot of space. This happens, for example, when text is pasted from a Word document without clearing the tags.
- Built-in code designed for mining cryptocurrencies. This can be done not only by the owner of the resource, but also by attackers who hack the site.
What to do if the site loads slowly
Cut off the excess
If the only function of an element is to be beautiful, go ahead and cut it. No one will appreciate the beauty of this block or that window if you have to wait a long time for them to load. The same goes for the rest of the page elements, and not just the visual ones.
Not all at once
Load the heaviest and bulkiest items last. Yes, the site page size will not change, but the user will have faster access to useful content.
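One common way to do this is to defer offscreen images. The sketch below is one possible approach, assuming a browser environment; the data-src attribute is an illustrative convention, not a required standard, and the simpler loading="lazy" attribute on img tags covers many of the same cases.

    // Images marked with data-src start loading only when they scroll into view.
    const lazyImages = document.querySelectorAll<HTMLImageElement>("img[data-src]");

    const observer = new IntersectionObserver((entries, obs) => {
      for (const entry of entries) {
        if (!entry.isIntersecting) continue;
        const img = entry.target as HTMLImageElement;
        img.src = img.dataset.src!;        // begin the download only now
        obs.unobserve(img);                // each image only needs to be loaded once
      }
    });

    lazyImages.forEach(img => observer.observe(img));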
What are the website size formats for layout?
Each device has several characteristics that describe its size:
- The physical size of the screen is its diagonal in inches (for example, 24 inches). Devices with the same screen size may have different resolutions.
- Resolution is the screen's width and height in pixels (for example, 1920 × 1080 pixels).
- The browser window size is the width and height of the viewport in pixels (for example, 1896 × 1080 after subtracting a 24-pixel scroll bar). In other words, it is the resolution minus borders, scroll bars and other elements that reduce the viewable area of the site. Since the user can freely resize the browser window, designers usually fix the content area and the side padding instead.
The physical size of the device can be taken into account when designing clickable elements, but in general the size of a site is chosen based on resolution, and the width and height of its content area (container) are determined with browser margins in mind.
You also need to remember about the orientation of the screens - for laptops, desktops and TVs it is landscape (width is greater than height), while for smartphones and tablets it is usually portrait (height is greater), but can be flipped.
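All of these values can be read directly in the browser console; the snippet below uses only standard DOM properties.

    console.log("Screen size:", screen.width, "x", screen.height);                  // size reported by the OS, in CSS pixels
    console.log("Browser window:", window.innerWidth, "x", window.innerHeight);     // viewport including scroll bars
    console.log("Content area:", document.documentElement.clientWidth, "x",
                document.documentElement.clientHeight);                             // viewport without scroll bars
    console.log("Device pixel ratio:", window.devicePixelRatio);                    // physical pixels per CSS pixel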