Hosting multiple Rails services on one server + website architecture with API support

I just finished reading Paul Dix's book, Service-Oriented Design with RoR, and I would like to build a Rails 3 web application based on what I just learned.

I think I got the basic architecture right, but a simple question is blocking me: how should I host multiple REST services on the same server?

Here's how I see things at the moment:

  • Create several small utility applications (UserService, XYZFeatureService, ...), probably based on Sinatra, that expose REST endpoints for accessing their resources (a minimal sketch of one such service follows this list).
  • Have a front-end Rails application with controllers / views / ... that consumes data from the different services. End users would access it at http://www.myapp.com , for example.
  • And finally, have a standalone "API" application handling calls to https://api.myapp.com/* or https://www.myapp.com/api/* , publishing an external API that uses the same services, with authentication, throttling, etc. layered on top.
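
To make the first point concrete, here is roughly what I picture one of these Sinatra services looking like. The UserService name, the /users routes, and the in-memory store are placeholders of my own, not something taken from the book:

    # user_service.rb - rough sketch of a minimal Sinatra-based service (names and routes are placeholders)
    require "sinatra"
    require "json"

    # In-memory store standing in for the service's real database, just to keep the sketch runnable
    USERS = { 1 => { "id" => 1, "name" => "alice" } }

    get "/users/:id" do
      content_type :json
      user = USERS[params[:id].to_i]
      halt 404, { "error" => "not found" }.to_json unless user
      user.to_json
    end

    post "/users" do
      content_type :json
      attrs = JSON.parse(request.body.read)
      id = USERS.keys.max + 1
      USERS[id] = attrs.merge("id" => id)
      status 201
      USERS[id].to_json
    end

Each service would run as its own small Rack application and own its own data.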

Does this sound like a good start to you?

As for the implementation, from what I read in the book, I plan to create gems to handle the connection between the Rails application and the services (I could throw in RabbitMQ, but that's another story).
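
For example, a user_service gem could expose a small client class like the sketch below. The class name, default port and /users/:id endpoint are my own placeholders, and plain Net::HTTP just keeps the sketch dependency-free:

    # lib/user_service/client.rb - sketch of the client class a "user_service" gem could expose
    # (class name, default port, and the /users/:id endpoint are placeholders)
    require "net/http"
    require "uri"
    require "json"

    module UserService
      class Client
        def initialize(options = {})
          @host = options[:host] || "localhost"
          @port = options[:port] || 9001
        end

        # GET /users/:id from the service; returns a Hash, or nil when the service answers 404
        def find_user(id)
          response = Net::HTTP.get_response(URI.parse("http://#{@host}:#{@port}/users/#{id}"))
          return nil if response.code == "404"
          JSON.parse(response.body)
        end
      end
    end

A Rails controller would then call something like UserService::Client.new(:port => 9001).find_user(params[:id]) instead of going through ActiveRecord directly.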

However, since I only have one physical server, I wonder how I am going to combine all these applications / services. My first guess is to run each service application on localhost:xxxx, where xxxx is a different unprivileged port for each service, and to configure each client gem in my Rails application to use these ports.
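
Something like the initializer below is what I have in mind for wiring the client gems (such as the client sketched above) to those local ports; the constant names and port numbers are only examples:

    # config/initializers/services.rb - wire each client gem to its local port per environment
    # (constant names and port numbers are illustrative only)
    SERVICE_PORTS = {
      "development" => { :user_service => 9001, :xyz_feature_service => 9002 },
      "test"        => { :user_service => 9001, :xyz_feature_service => 9002 },
      "production"  => { :user_service => 9001, :xyz_feature_service => 9002 }
    }.fetch(Rails.env.to_s)

    USER_SERVICE = UserService::Client.new(:host => "localhost", :port => SERVICE_PORTS[:user_service])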

Along with this, I would probably run Apache 2 + Passenger to serve my Rails front end and the API application, using something like Rack::URLMap (or virtual hosts when using a subdomain) to direct requests to the right application. Should I also use Passenger to run the services themselves in a production environment?
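
If I go with the www.myapp.com/api/* form rather than the subdomain, I imagine the config.ru would look roughly like this; the two lambdas stand in for the real front-end and API applications, which would normally be required and mounted here instead:

    # config.ru - rough sketch of path-prefix routing with Rack::URLMap
    # (the lambdas are placeholders for the real Rack/Rails applications)
    frontend_app = lambda { |env| [200, { "Content-Type" => "text/html" }, ["front-end"]] }
    api_app      = lambda { |env| [200, { "Content-Type" => "application/json" }, ['{"status":"ok"}']] }

    run Rack::URLMap.new(
      "/"    => frontend_app,
      "/api" => api_app
    )

With the api.myapp.com subdomain instead, I would skip URLMap and let an Apache virtual host point Passenger at the API application directly.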

Is this the right way to go? It feels in line with what I have read and found out, and it could, if necessary, easily be split across several physical servers later, but I would like to be sure I am not missing anything. Do you build things differently?

Thanks so much for your input!

Update

The main questions I want to answer are:

  • Is the architecture described suitable for building a web application with external API endpoints?
  • Can I run the services on the same server, on different ports?

Thanks!

2 answers

So, this question is a little over three years old now, and I think it can get a reasonably objective answer.

It's funny to read this question again and see how much weight it was given at the time, when the simple "high level" answer really is: do what you need / want to do!

There is no magic rule to follow, although I suppose that is what I was looking for at the time. There are, however, some key points to keep in mind:

  • Developing a service-oriented architecture means getting ready to scale. Each service is designed to work on its own and must not rely on running on the same server as the other services in the stack. Do not couple your services; they must remain independent.

  • However, do not overdo it: the temptation is strong to spend a lot of time designing the ideal architecture, when what you really need to do is ship your v1!

  • When you create separate services, do not make them more complex than necessary: a simple web stack with REST(-like) endpoints is likely to be sufficient for a start. RabbitMQ and other message queues are great too, but they solve problems you may not have yet.

  • As for servers... in an ideal world you would have one server per service, everything hosted in a data center, replicated to a second (or more!) set of servers in a physically separate data center... but that takes time and money to set up and maintain. If you are in a large organization this may be fine, but then you probably did not need to read this answer anyway.
    So yes, you can start small! One or two servers, or one "big" machine with virtualized servers on it... it all depends on how confident you are running the operations side yourself, or on hiring a system administrator. One machine may be enough, and feel free to run several services on it as long as they can share the same system resources and memory.
    Today I would probably use nginx to route requests to the correct service, based on hostnames or ports, run the private services on different ports, and use a firewall (e.g. Shorewall) to block outside requests to those ports.

So there it is... as I said, there is no magic answer, only solutions to each problem that needs solving. What I have learned over the past three years, working mostly on medium / large projects, is that starting simple is key.


I use an Apache + Passenger setup and a script (see below), but I have read a lot about benchmarks that put Node.js behind an Nginx load balancer, and at least for serving the web services API, that might make sense.

My script:

    def build_a_new_oxenserver
      site = siteurl.gsub(/\./, "_")
      system("rake add_site['#{siteurl}','#{site}','#{id}']") if Rails.env.production?

      default_tmpl = open(File.expand_path(Rails.root + "public/default_template.html")).read
      tmpl = Template.create(:ox_id => id, :name => "first template", :content => default_tmpl)
      pg = Page.create(
        :ox_id       => id,
        :language_id => 1,
        :template_id => tmpl.id,
        :title       => "Home",
        :menu_label  => "Zu Hause",
        :ancestry    => nil,
        :root        => true
      )

      # add the Paragraph element to this ox toolbox
      self.elements << Element.find(1)

      # add an Article, a Paragraph, and a Post
      pe = PageElement.create(:element_id => Element.find(1))
      pe.elementable = Paragraph.create(:content => "This is written *in bold* -")
      pe.save
      pg.page_elements << pe
    end

The add_site rake task performs a remote task on the production server, creating the necessary folders, configuration files and related scripts to run a new "instance". This way I can scale my services out, and with a little effort I could also add load balancing.

Please note that this solution is open source.

The rake script looks like this:

    #
    # rake add_site["www.domain.tld", "www_domain_tld", 131]
    desc "Task for adding new oxenserver site"
    task :add_site, :domain, :site, :ox_id do |t, args|
      service_path = "/data/www/html/app_service"

      site_string = %{
        <VirtualHost *:80>
          ServerName #{args[:domain]}
          DocumentRoot #{service_path}/sites/#{args[:site]}/public
          PassengerAppRoot #{service_path}/sites/#{args[:site]}
          SetEnv OX_ID #{args[:ox_id]}
          <Directory #{service_path}/sites/#{args[:site]}/public>
            AllowOverride all
            Options -MultiViews
          </Directory>
        </VirtualHost>
      }
      File.open("tmp/#{args[:site]}.conf", "w") do |f|
        f.write site_string
      end

      site_start = %{
        mv #{service_path}/current/tmp/#{args[:site]}.conf /data/apache/conf.d/#{args[:site]}.conf
        service httpd restart
      }
      File.open("tmp/#{args[:site]}.sh", "w") do |f|
        f.write site_start
      end

      #
      sites_dir        = "#{service_path}/sites/#{args[:site]}"
      shared_sites_dir = "#{service_path}/shared/sites/#{args[:site]}"
      shared_oxen_dir  = "#{service_path}/shared/sites/oxen"
      current          = "#{service_path}/current"

      # prepare system files/directories
      system "mkdir #{sites_dir} #{shared_sites_dir} #{shared_sites_dir}/public"
      system "cd #{sites_dir} && cp -rus #{current}/* ."
      system "cd #{shared_sites_dir}/public && cp -r #{shared_oxen_dir}/public/* ."
      system "cd #{shared_sites_dir} && mkdir log tmp && cd #{sites_dir} && rm -rf public log tmp && ln -s #{shared_sites_dir}/public public && ln -s #{shared_sites_dir}/log log && ln -s #{shared_sites_dir}/tmp tmp"
      system "cd #{sites_dir} && touch tmp/restart.txt log/production.log"
      system "mv tmp/#{args[:site]}.sh public/system/job_queue/#{args[:site]}.sh"
    end
