Server-side processing or client-side processing?

I have developed a comparison website for products sold online in India. Currently, the site is fully client-side:

  • Accepts user input.
  • Makes 20-30 AJAX requests and receives results from all the major online stores.
  • Uses client-side scripts to sort the results and display them in the most useful way.
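The flow above can be sketched roughly like this. The store endpoints and the `{ price }` response shape are assumptions for illustration, not the real APIs:

```javascript
// Rough sketch of the current client-side flow. Store endpoints and the
// { price } response shape are placeholders, not real APIs.
const STORE_ENDPOINTS = [
  "https://store-a.example/search",
  "https://store-b.example/search",
  // ...in practice, 20-30 store endpoints
];

// Pure helper: cheapest offer first.
function sortByPrice(offers) {
  return [...offers].sort((a, b) => a.price - b.price);
}

// Fire all requests in parallel; allSettled keeps one slow or failing
// store from sinking the whole comparison.
async function queryAllStores(query, fetchFn) {
  const settled = await Promise.allSettled(
    STORE_ENDPOINTS.map((url) =>
      fetchFn(url + "?q=" + encodeURIComponent(query)).then((r) => r.json())
    )
  );
  const offers = settled
    .filter((s) => s.status === "fulfilled")
    .map((s) => s.value);
  return sortByPrice(offers);
}
```

Using `Promise.allSettled` (rather than `Promise.all`) means one unreachable store degrades the comparison instead of breaking it.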

Disadvantages of the current client-side approach:

  • My client code is visible to everyone, since it is JavaScript.
  • It is more prone to browser errors and incompatibilities.
  • It is not reliable.

Disadvantages of moving to server-side processing:

  • Given my site's traffic, server load will increase, because the server stays engaged with each client for a longer period.
  • Fetching values from the various store websites can take up to 10 seconds in the worst case, and the server is tied up for that entire time. Consider the load with 500 visitors per minute at peak.

Benefits of server-side processing:

  • My code stays private and secure.
  • Client-side processing would be minimal, so the site would work even on mobile phones and other devices.

I want to analyze these trade-offs before actually implementing anything. Can someone suggest which approach would be better for my site?

Please comment if my question is ambiguous.

2 answers

Well, first of all, this is a very good question.

It depends entirely on the volume of traffic and transactions your site handles, but if you really want your application to scale, do it right, and do it from the very beginning.

Keep your business logic off the client side. Do not count on the end user's network bandwidth when he makes a comparison call :) and do not assume he has the best bandwidth.

Load-balance your server farm. Make sure the farm balances load correctly and that each comparison runs across multiple threads instead of a single thread.
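The dispatch side of this can be sketched minimally, assuming a hypothetical list of farm servers. Round-robin is shown here for simplicity; real balancers add health checks and weighting:

```javascript
// Minimal round-robin dispatcher sketch. Server names are hypothetical;
// a production balancer would also track health and load.
class RoundRobinBalancer {
  constructor(servers) {
    this.servers = servers;
    this.next = 0;
  }

  // Pick the next server in rotation for an incoming comparison job.
  pick() {
    const server = this.servers[this.next];
    this.next = (this.next + 1) % this.servers.length;
    return server;
  }
}
```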

Cache results. If you do the work on the server side and user A and user B ask for the same comparison, you can serve user B straight from the cache instead of repeating all of those requests.
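A minimal sketch of that idea, assuming an in-memory map with a time-to-live (the 5-minute TTL is an arbitrary assumption; price data staleness tolerance is your call):

```javascript
// In-memory cache with a time-to-live, so user B's identical comparison
// reuses user A's result instead of re-hitting 20-30 stores.
class ComparisonCache {
  constructor(ttlMs = 5 * 60 * 1000) { // 5-minute TTL is an assumption
    this.ttlMs = ttlMs;
    this.entries = new Map();
  }

  // Returns the cached value, or undefined if missing or expired.
  get(key, now = Date.now()) {
    const entry = this.entries.get(key);
    if (!entry || now - entry.storedAt > this.ttlMs) return undefined;
    return entry.value;
  }

  set(key, value, now = Date.now()) {
    this.entries.set(key, { value, storedAt: now });
  }
}
```

For a multi-server farm you would put this behind something shared (e.g. a dedicated cache server) rather than per-process memory.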

Usability: show the user the progress of the comparison, for example with a loading spinner :)

Hope this helps.


I don't understand why you claim "more prone to browser errors." JavaScript will work exactly the way you want if you send the correct libraries/code to the browser.

You can detect which browser is making the request and send it the matching version of the code (that is: if you have code that works in FF and Opera but not in IE, write two versions of that code, one for each group of browsers). There are several libraries that can help you with this.
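In practice, feature detection is more robust than guessing from the user-agent string: test for the capability itself and fall back when it is missing. A minimal sketch, using the classic XHR fallback chain from that era as the example:

```javascript
// Feature detection sketch: probe for the capability instead of
// sniffing the browser name. The ActiveX branch is only reachable
// in legacy IE; modern environments take the first branch.
function getXhrFactory() {
  if (typeof XMLHttpRequest !== "undefined") {
    return () => new XMLHttpRequest();
  }
  if (typeof ActiveXObject !== "undefined") {
    return () => new ActiveXObject("Microsoft.XMLHTTP");
  }
  return null; // no AJAX capability at all
}
```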

Also, the comparison itself is not something that has to be done on the server side. So if you have a lot of traffic on your site, client-side processing should work fine for this kind of thing.

About code protection, you're right: the server side will not let anyone read your code. So you must decide whether the extra server load matters more than people being able to read your code.

Also keep in mind that you can obfuscate your code. I know it is not a perfect solution, but it will stop many people from reading it.

EDIT:

Moving the work to the server side means all features will run on all devices (as already mentioned). But there are some things you should do to keep the load on your server from spiking.

Chandra Sekhar Walajap has already told you about the benefit of caching results. I would personally go a step further, since you are scraping all of these pages anyway.

I would build a scraper that runs, say, every 24 hours and fetches/parses every product. Then I would save all of those products somewhere (read: a database, or wherever). That way, when a user requests a comparison between products A and B, you do not need to fetch and parse all the sites; you just look those products up where you saved them and show them to the user.
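The lookup side of that design can be sketched like this; the product shape and the in-memory `Map` stand in for whatever database you choose, and the daily scraper job that calls `upsert` is assumed, not shown:

```javascript
// Sketch of the pre-scraped product store. The Map stands in for a real
// database; a hypothetical 24-hour scraper job would call upsert() for
// each product it fetches.
class ProductStore {
  constructor() {
    this.byId = new Map();
  }

  // Called by the daily scraper for each product it has parsed.
  upsert(product) {
    this.byId.set(product.id, product);
  }

  // Comparison request: two local lookups, zero live scraping.
  compare(idA, idB) {
    const a = this.byId.get(idA);
    const b = this.byId.get(idB);
    if (!a || !b) return null; // caller could fall back to a live scrape
    return { cheaper: a.price <= b.price ? a : b, products: [a, b] };
  }
}
```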

Note that this saves significant bandwidth only if you have many users comparing many products.

On the other hand, if you have just a few users looking up only a few products, this solution will not do you any good, because you would still be fetching everything from all the sites every 24 hours, paying for it in both CPU and bandwidth.

About server load: I cannot say; it depends on too many factors to predict. How big are the sites you are scraping? How many products are on each website? What hardware do you use? You would be better off building a simple example with cURL that extracts everything from one of the sites, then measuring how it affects performance and deciding from there.
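That benchmarking idea amounts to a tiny timing wrapper around one real fetch of one store; a minimal sketch (the store URL and fetch call are placeholders you would swap in):

```javascript
// Tiny benchmark helper: time one operation before committing to an
// architecture. Wrap a single real store fetch in it, e.g.
//   timeSync(() => /* your cURL-equivalent fetch of one store page */)
// The example below uses a trivial function, since the real URL is yours.
function timeSync(fn) {
  const start = Date.now();
  const result = fn();
  return { result, ms: Date.now() - start };
}
```

Run it against one store, multiply by your store count and peak visitors per minute, and you have a first-order load estimate to decide with.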


Source: https://habr.com/ru/post/1436300/
