Getting error: Route() in Route cannot be applied to String

I am developing a Java-based MongoDB application and I ran into a problem when working with Spark.

package com.tengen;

import spark.Request;
import spark.Response;
import spark.Route;
import spark.Spark;

public class HelloWorldSparkStyle {
    public static void main(String[] args) {
        Spark.get(new Route("/") {
            @Override
            public Object handle(Request request, Response response) {
                return "Hello World From Spark";
            }
        });
    }
}

An Route("/")error appears in the new one Route() in route cannot be applied to java.lang.string.

I am confused why this does not work, as I followed their code exactly.

+4
5 answers

This should probably be posted on the MongoDB class forum, but I ran into a similar problem. The get method's signature seems to have changed since the course material was prepared: it now takes the path and the Route as separate arguments

get(path, route)

import spark.Request;
import spark.Response;
import spark.Route;
import spark.Spark;

public class HelloWorldSparkStyle {
    public static void main(String[] args){

        Spark.get("/", new Route() {
                public Object handle(final Request request, final Response response){
                return "Hello World from Spark";
            }
        });
    }
}
+25

The course material was written against spark-core-1.1.1.jar, but the current Spark release (2.0.0) changed the API. If you want the course code to compile as written, keep spark-core-1.1.1.jar as the dependency.
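
As an illustration only (the exact Maven coordinates are an assumption; verify them on Maven Central for your setup), pinning the older version in the POM would look roughly like this:

<!-- hypothetical POM fragment: pins the Spark version the course code was written against -->
<dependency>
    <groupId>com.sparkjava</groupId>
    <artifactId>spark-core</artifactId>
    <version>1.1.1</version>
</dependency>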

+1

That constructor is from Spark 1. With Spark 2 the same route is written like this (see: http://sparkjava.com/news.html):

import static spark.Spark.*;

public class HelloWorld {
    public static void main(String[] args) {
        get("/", (req, res) -> "Hello World From Spark");
    }
}
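
As a follow-up sketch (assuming a recent Spark 2.x, where spark.Spark.port exists, and Java 8; the class name is made up), the same lambda style can also set the listening port before the first route is registered:

import static spark.Spark.*;

public class HelloWorldWithPort {
    public static void main(String[] args) {
        // Optional: Spark listens on port 4567 by default; change it before declaring routes.
        port(8080);
        get("/", (req, res) -> "Hello World From Spark");
    }
}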
0

The Spark Java Route API has changed. For example:

public static void main(String[] args) {
    spark.Spark.port(PortNumber);
    Spark.get("/", new Route() {
            public Object handle(Request request, Response response) throws Exception {
                return "This is a sample page";
            }
        });
}

, "/" - . spark.Spark.port(PortNumber).

0

Change the version of Spark in the POM file of the exercise files that you download from the handout. This fixed the issue for me.

0

Source: https://habr.com/ru/post/1542236/

