When writing websites for smaller audiences (e.g. intranet-hosted, data-driven websites that handle from a few to a few thousand users), we tend to follow best practices only within the constraints of the project budget/deadline; that is, developer costs, rollout timescales and maintainability have a much greater impact on how we code things.
Some things can also be safely ignored (up to a point), such as delivery time, image compression/size and bandwidth, because the nature of LAN-hosted applications tends to mean that, within reason, the financial cost of such issues is relatively low and not worth worrying about.
However, when looking to reach a much wider audience, e.g. an audience of (hopefully) millions of users:
> Are there any best practices that no longer need to be worried about (i.e. that become irrelevant the larger the audience gets)?
> Are there any practices that should be followed even more strictly?
> Also, are there any practices that only really come into play once your audience reaches a certain critical mass [and what would that critical mass be]? That is, the application of artificial restrictions that wouldn't concern you on a private network.
The examples I have encountered so far are:
> Hosting code libraries such as jQuery on Google, because they are served from Google's CDN and can be delivered faster than from your own server. This also helps reduce the bandwidth costs of delivering your site (see the sketch after this list).
> Hosting images on a CDN, for the same reason as hosting your JavaScript elsewhere.
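For illustration, here is a minimal sketch of the CDN-with-fallback pattern (the local path /js/jquery.min.js is a hypothetical example, not something specified above):

```html
<!-- Load jQuery from Google's CDN; historically, users may already have
     it cached from other sites referencing the same URL. -->
<script src="https://ajax.googleapis.com/ajax/libs/jquery/3.7.1/jquery.min.js"></script>

<!-- If the CDN is unreachable, window.jQuery is undefined, so fall back
     to a self-hosted copy (hypothetical path). -->
<script>
  window.jQuery || document.write('<script src="/js/jquery.min.js"><\/script>');
</script>
```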
Although I have asked this question in various contexts before, I can't find information on any site dedicated to very large audiences – for example, at the scale of hundreds of thousands or even millions of users.
I guess it depends on what people aim for within the pressure "triangle": CAP (Consistency, Availability and tolerance to network Partitions). For example, one can only have so much "C" when network disruptions cause "P".
Nowadays, the focus appears to be more on delivering a "good user experience", which seems to hinge on "time to result" (e.g. having a complete web page on the user's desktop): this translates into investing (among other things) more in the "A" and "P" sides than in the "C" side.
More concretely: take some time to decide when to perform data aggregation at the presentation layer for your users, e.g. can I aggregate this data over a longer period before recomputing another view to push?
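As a rough sketch of that idea (all identifiers here, such as recomputeSummary and AGGREGATION_WINDOW_MS, are hypothetical, and the expensive query is stubbed out), a view can be recomputed on a fixed schedule and served from a cache, relaxing consistency within the aggregation window in exchange for availability and fast responses:

```javascript
// Minimal sketch: recompute an expensive aggregate on a fixed schedule
// and serve the cached result, rather than recalculating per request.
// All identifiers below are hypothetical examples.

const AGGREGATION_WINDOW_MS = 60 * 1000; // recompute at most once a minute

let cachedSummary = { status: "warming up" };

async function recomputeSummary() {
  // Stand-in for an expensive aggregation over the raw data store.
  cachedSummary = { computedAt: new Date().toISOString() };
}

// Recompute in the background ("A" and "P" favoured)...
recomputeSummary();                                   // prime the cache
setInterval(recomputeSummary, AGGREGATION_WINDOW_MS); // then refresh periodically

// ...so request handlers return a possibly slightly stale value
// immediately ("C" is relaxed within the aggregation window).
function handleSummaryRequest(req, res) {
  res.setHeader("Content-Type", "application/json");
  res.end(JSON.stringify(cachedSummary));
}
```

The window length is the knob here: a longer window costs less compute and bandwidth but serves staler data to users.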
Of course, I am only barely scratching the surface of the problem.