Continuous Deployment for Java / JVM Web Application

My team and I would like to implement continuous deployment for our site. Continuous deployment basically means deploying to production very often (several times a day). Presumably, Etsy does this all the time.

Our environment is Tomcat + Nginx. We already continuously deploy every code change to our snapshot server (i.e. traditional continuous integration) using Hudson and the Hudson Cargo plugin, which hot-deploys the application.

Surprisingly, this works well (although we sometimes have to restart Tomcat after a while).

For production this will not work, because the site would be down while the application is redeployed. I have some ideas, such as running two copies of the web application and redirecting traffic to one while the other is being updated; a rough sketch of that idea follows.
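A minimal sketch of that idea for the Nginx front end described above, assuming two Tomcat instances on ports 8080 and 8081 (the ports, file path and upstream name are illustrative, not from the question):

# hypothetical /etc/nginx/conf.d/tomcat.conf
upstream tomcats {
    server 127.0.0.1:8080;   # Tomcat instance A
    server 127.0.0.1:8081;   # Tomcat instance B
}

server {
    listen 80;

    location / {
        # Either instance can be taken out of rotation while it is redeployed.
        proxy_pass http://tomcats;
        proxy_set_header Host $host;
    }
}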

Does anyone have any ideas or have done this before in a real production environment?

3 answers

From http://radar.oreilly.com/2009/03/continuous-deployment-5-eas.html :

Real-time alerting. No matter how good your deployment process is, bugs can still get through. The most annoying variety are bugs that don't manifest until hours or days after the code that caused them is deployed. To catch those nasty bugs, you need a monitoring platform that can let you know when things have gone awry and get a human being involved in debugging them.

To implement continuous deployment in production effectively, you need good monitoring; otherwise you will not know what is going on with your application.


I don’t know WHY you think this is a good idea, but it is up to you.

I would put a load balancer in front of two hot systems (Tomcat itself can do this balancing), then simply stop one server, deploy to it, and start it again. Allow a two-minute window for each hot server and you should be good.

EDIT: I would not do this EVERY time, though. We are also a small company with a lot of QA (tm), but it is still one click in the build system to go live.
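This answer leans on Tomcat's own balancing; for an Nginx front end like the asker's, an equivalent rolling step could be sketched by marking one upstream member down while it is redeployed (ports hypothetical):

upstream tomcats {
    server 127.0.0.1:8080 down;   # temporarily out of rotation while being redeployed
    server 127.0.0.1:8081;        # keeps serving traffic in the meantime
}

Reload Nginx (nginx -s reload), deploy to the instance on 8080, remove the down flag, reload again, and then repeat the procedure for the other instance.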


We use Apache httpd 2.2 and mod_proxy for this.

Then we run two Tomcats, one on port 8080 and one on port 88. The firewall blocks external access to those ports, so only port 80 is open.

Apache httpd is configured to listen on port 80.

It is very easy to set up. This is the basic configuration (httpd.conf) that will work out of the box:

LoadModule proxy_module modules/mod_proxy.so
LoadModule proxy_balancer_module modules/mod_proxy_balancer.so
LoadModule proxy_http_module modules/mod_proxy_http.so

<Proxy balancer://mycluster>
    BalancerMember http://localhost:8080
    BalancerMember http://localhost:88 status=+H
</Proxy>

ProxyPass / balancer://mycluster/
ProxyPassReverse / balancer://mycluster/

"+ H" means that it is used only as a backup server, so when the 8080 is unavailable, it will work at 88 until the 8080 returns to the network


Source: https://habr.com/ru/post/1340624/

