How to install / run Spark Java Framework on AWS Elastic Beanstalk?

I usually build Java web applications for Tomcat or GlassFish, packaged as a WAR (web application archive). Such a file can be deployed to AWS through Elastic Beanstalk with a few clicks; integration is easy because Elastic Beanstalk supports deploying a web application to the Tomcat / GlassFish / Java platforms directly.

I recently started using a lightweight Java micro framework called Spark (www.sparkjava.com). Is it possible to deploy it on Elastic Beanstalk in a few clicks? If not, is there an alternative way to easily deploy a Spark Java web application on AWS? Or do I need to create an EC2 instance, copy the application JAR over, and run it from the command line?

2 answers

Sure. You can either deploy it as a WAR through the Tomcat platform, or run it as a fat JAR using the Java SE platform.
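For the fat-JAR route, note that Elastic Beanstalk's Java SE platform proxies incoming traffic to port 5000 by default (configurable through the `PORT` environment variable), while Spark listens on 4567 unless told otherwise. A minimal sketch of the port wiring (the class name `App` and the `resolvePort` helper are assumptions, not part of Spark's API):

```java
public class App {

    // Pick the HTTP port: Elastic Beanstalk's Java SE platform forwards
    // requests to port 5000 by default and exposes it via the PORT env var.
    // Fall back to 5000 when the variable is absent or malformed.
    public static int resolvePort(String portEnv) {
        if (portEnv == null) {
            return 5000;
        }
        try {
            return Integer.parseInt(portEnv);
        } catch (NumberFormatException e) {
            return 5000;
        }
    }

    public static void main(String[] args) {
        int port = resolvePort(System.getenv("PORT"));
        // With Spark on the classpath you would then bind it, e.g.:
        //   spark.Spark.port(port);
        //   spark.Spark.get("/", (req, res) -> "Hello from Elastic Beanstalk");
        System.out.println("Binding HTTP server to port " + port);
    }
}
```

Build the fat JAR with the Maven Shade plugin (or the Gradle equivalent), upload the JAR as your application version, and Elastic Beanstalk runs it with `java -jar` for you.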


I published a Maven archetype for AWS Elastic Beanstalk that uses Dropwizard instead of Spark (but both use embedded Jetty), so it should be fairly easy to adapt to Spark. Note that it uses the Docker solution stack (more flexible and easier to debug locally). This command should get you started:

$ mvn archetype:generate -Dfilter=elasticbeanstalk-docker
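If you take the Docker route with Spark itself rather than the archetype, the image only needs a JRE and your fat JAR. A minimal sketch of a Dockerfile for the Elastic Beanstalk Docker platform (the jar path `target/app.jar` is an assumption; match it to your build output):

```dockerfile
FROM openjdk:8-jre
# Copy the fat JAR produced by your build into the image.
COPY target/app.jar /app.jar
# Spark listens on 4567 by default.
EXPOSE 4567
CMD ["java", "-jar", "/app.jar"]
```

Elastic Beanstalk reads the `EXPOSE` instruction to know which container port to map, which is what makes this stack easy to test locally with a plain `docker run` before deploying.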

In addition, AWS lets you run a plain Java application on the Java SE platform, as @k.liakos said in his answer.



