Endpoints

Endpoints are objects that represent the server side of HTTP transactions. They are instances of the DeproxyEndpoint class. Endpoints receive HTTP requests and return HTTP responses. The responses are generated by handlers, which can be static functions, object instance methods, or closures. Handlers can be set when the endpoint is created, or specified on a per-client-request basis. Endpoints are created using the addEndpoint method of the Deproxy class. (Instantiating a DeproxyEndpoint object directly via the constructor is discouraged.)

Endpoints are very flexible. You can create intricate testing situations with them in combination with custom handlers.
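
A handler is anything callable that accepts a Request and returns a Response. As a sketch, a closure handler might look like the following (the Response arguments shown here, a status code, reason phrase, headers, and body, follow Deproxy's conventions; check them against the version you are using):

def deproxy = new Deproxy()

// a closure handler: inspect the incoming Request, build a Response
def teapotHandler = { request ->
    new Response(418, "I'm A Teapot",
                 ['Content-Type': 'text/plain'],
                 "short and stout")
}

// attach it as the endpoint's default handler...
def endpoint = deproxy.addEndpoint(9998, defaultHandler: teapotHandler)

// ...or supply it for a single client request
def mc = deproxy.makeRequest(url: "http://localhost:9998/",
                             defaultHandler: teapotHandler)
assert mc.receivedResponse.code == "418"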

How To: Single Server

In the simplest case, a proxy sits in between the client and a server.

 ________                         ________                        ________
|        |  --->  Request  --->  |        |  ---> Request  --->  |        |
| Client |                       | Proxy  |                      | Server |
|________|  <---  Response <---  |________|  <--- Response <---  |________|

Since we’re testing the proxy, we want to be able to control the responses that the server sends in reaction to requests from the proxy. We can simulate the server using a single DeproxyEndpoint. By default, the endpoint will simply return 200 OK responses, unless it is given a different handler upon creation.

Deproxy deproxy = new Deproxy()

DeproxyEndpoint endpoint = deproxy.addEndpoint(9999)

def theProxy = new TheProxy(port: 8080,
                            targetHostname: "localhost",
                            targetPort: 9999)

def mc = deproxy.makeRequest(url: "http://localhost:8080/path/to/resource")

assert mc.receivedResponse.code == "200"
assert mc.handlings.size() == 1

How To: Auxiliary Service

A more complicated case is when the proxy has to call out to some auxiliary service for additional information.

 ________                         ________                        ________
|        |  --->  Request  --->  |        |  ---> Request  --->  |        |
| Client |                       | Proxy  |                      | Server |
|________|  <---  Response <---  |________|  <--- Response <---  |________|

                                   |    ^
                           Request |    |  Response
                                   v    |
                                  ________
                                 |  Aux.  |
                                 |Service |
                                 |________|

An excellent example is an authentication system. The client sends the request to the proxy and includes credentials. To determine whether the credentials are valid, the proxy makes a separate HTTP request to the auth service, which responds with yea or nay. Depending on the answer, the proxy either forwards the request on to the server or returns an error to the client.

In this setup, we can simulate both the server and the auxiliary service with DeproxyEndpoint objects. The endpoint representing the auth service must be given a custom handler that can interpret and respond to the proxy's authentication requests according to whatever contract is required.

Deproxy deproxy = new Deproxy()

def endpoint = deproxy.addEndpoint(9999)

def authResponder = new AuthResponder()
def authService = deproxy.addEndpoint(7777, defaultHandler: authResponder.handler)

def theProxy = new TheProxy(port: 8080,
                            targetHostname: "localhost",
                            targetPort: 9999,
                            authServiceHostname: "localhost",
                            authServicePort: 7777)

def mc = deproxy.makeRequest(url: "http://localhost:8080/path/to/resource",
                             headers: ['X-User': 'valid-user'])

assert mc.receivedResponse.code == "200"
assert mc.handlings.size() == 1
assert mc.handlings[0].endpoint == endpoint
assert mc.orphanedHandlings.size() == 1
assert mc.orphanedHandlings[0].endpoint == authService

mc = deproxy.makeRequest(url: "http://localhost:8080/path/to/resource",
                         headers: ['X-User': 'invalid-user'])

assert mc.receivedResponse.code == "403"
assert mc.handlings.size() == 0
assert mc.orphanedHandlings.size() == 1
assert mc.orphanedHandlings[0].endpoint == authService
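
The AuthResponder class used above is not part of Deproxy; it stands in for whatever logic your auth service's contract requires. A minimal, hypothetical sketch, which accepts only the user 'valid-user' (the header name and status codes are illustrative, not prescribed by Deproxy):

class AuthResponder {
    def handler = { request ->
        // the proxy is assumed to forward the user id in a header;
        // a real auth contract will differ
        def user = request.headers.getFirstValue("X-User")
        if (user == "valid-user") {
            new Response(200, "OK", [:], "")
        } else {
            new Response(401, "Unauthorized", [:], "")
        }
    }
}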

How To: Multiple Servers

Sometimes, a proxy might be set up in front of multiple servers.

                                                                  ________
                                   .------------> Request  --->  |        |
                                   |                             | Server1|
                                   |    .-------- Response <---  |________|
                                   |    |
                                   |    v
 ________                         ________                        ________
|        |  --->  Request  --->  |        |  ---> Request  --->  |        |
| Client |                       | Proxy  |                      | Server2|
|________|  <---  Response <---  |________|  <--- Response <---  |________|

                                   ^    |
                                   |    |                         ________
                                   |    `-------- Request  --->  |        |
                                   |                             | Server3|
                                   `------------- Response <---  |________|

This might be the case, for example, if the proxy is acting as a load balancer, or providing access to different versions of a REST API based on the URI. This is simple enough to simulate by creating multiple endpoint objects and configuring the proxy to forward client requests to them.

Deproxy deproxy = new Deproxy()

def endpoint1 = deproxy.addEndpoint(9999)
def endpoint2 = deproxy.addEndpoint(9998)
def endpoint3 = deproxy.addEndpoint(9997)

def theProxy = new TheProxy(port: 8080,
                            targets: [
                                [hostname: "localhost", port: 9999],
                                [hostname: "localhost", port: 9998],
                                [hostname: "localhost", port: 9997]],
                            loadBalanceBehavior: Behavior.RoundRobin)

def mc = deproxy.makeRequest(url: "http://localhost:8080/path/to/resource")

assert mc.receivedResponse.code == "200"
assert mc.handlings.size() == 1
assert mc.handlings[0].endpoint == endpoint1

mc = deproxy.makeRequest(url: "http://localhost:8080/path/to/resource")

assert mc.receivedResponse.code == "200"
assert mc.handlings.size() == 1
assert mc.handlings[0].endpoint == endpoint2

mc = deproxy.makeRequest(url: "http://localhost:8080/path/to/resource")

assert mc.receivedResponse.code == "200"
assert mc.handlings.size() == 1
assert mc.handlings[0].endpoint == endpoint3

Routing

Even more complex situations can be created using the built-in Route handler, which routes requests on to real, existing servers.

 ________          ________          ________        ________
|        |  --->  |        |  --->  |  Fake  | ---> |  Real  |
| Client |        | Proxy  |        | Server |      | Server |
|________|  <---  |________|  <---  |________| <--- |________|

                    |    ^
                    v    |
                   ________
                  |Fake Aux|
                  |Service |
                  |________|

                    |    ^
                    v    |
                   ________
                  |Real Aux|
                  |Service |
                  |________|

This will work for requests from the proxy to an auxiliary service, or from the client/proxy to the server. It can save us the trouble of implementing our own handlers to simulate the server or auxiliary service. But why this roundabout approach? Why not just configure the proxy to send requests to those locations directly? The real advantage is that the requests pass through an endpoint, so each Request and Response gets captured and attached to a MessageChain. When makeRequest returns, we can make assertions against those requests and responses, which would be entirely invisible to us if the proxy had sent them directly.

Deproxy deproxy = new Deproxy()

def endpoint = deproxy.addEndpoint(9999,
        defaultHandler: Handlers.Route("real.server.example.com"))

def authService = deproxy.addEndpoint(7777,
        defaultHandler: Handlers.Route("real.auth.service.example.com"))

def theProxy = new TheProxy(port: 8080,
                            targetHostname: "localhost",
                            targetPort: 9999,
                            authServiceHostname: "localhost",
                            authServicePort: 7777)

Endpoint Lifecycle

  1. When an endpoint is created, it opens a socket on the designated port and spawns a thread to listen for connections to that socket.

  2. Whenever a new connection is made, the listener thread will spawn a new handler thread.

  3. The handler thread will proceed to service HTTP requests, as follows:
    1. First the incoming request is read from the socket, and parsed into a Request object.
    2. The endpoint will examine the request headers for a Deproxy-Request-ID header, and then try to match it to an existing MessageChain (created before in a call to makeRequest).
    3. The endpoint will then determine which handler to use (see Handler Resolution Procedure), and pass the Request object to the handler to get a Response object.
    4. If there is a MessageChain associated with the request, a Handling will be created and attached to the message chain. Otherwise, it will be attached to the orphanedHandlings list of all active message chains.
    5. The endpoint will then send the response back to the sender.
    6. Finally, if the handler indicated that the connection should be closed (by setting the Connection header to close), then the endpoint will exit the loop and close the connection. Otherwise, it will return to step 1 above and wait for the next request on the connection.
  4. When shutdown is called on a parent Deproxy object, all of its endpoints will be shut down. Their listener threads will stop listening and will no longer accept new connections. Any long-running handler threads will continue to run until finished or the JVM terminates, whichever comes first.
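
The per-connection loop in step 3 can be sketched as Groovy-flavored pseudocode (illustrative only; the helper names here are made up and do not appear in Deproxy):

while (true) {
    def request = parseRequest(socket)                          // step 1
    def chain = findMessageChainById(request.headers)           // step 2: Deproxy-Request-ID
    def handler = resolveHandler(request, chain)                // step 3
    def response = handler(request)
    def handling = new Handling(endpoint, request, response)    // step 4
    if (chain != null) {
        chain.handlings.add(handling)
    } else {
        addToOrphanedHandlingsOfAllActiveChains(handling)
    }
    sendResponse(socket, response)                              // step 5
    if (response.headers.getFirstValue("Connection") == "close") {
        break                                                   // step 6
    }
}
socket.close()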
