I have two Windows 2008 R2 servers in a farm. I also have another 2008 R2 server functioning as a gateway/connection broker. I am in the process of moving some of the apps and roles to 2012 servers as well: I am using a 2012 server in the RDWEB/Gateway role along with another 2012 server running some RemoteApps.
Here's what I am a little confused about. I have the 2008 R2 machines pointed to the 2012 server to publish their apps, and the 2012 server has the necessary change to the XML file so that this works well. Users log into the web site, click on an application, and the request is routed to one of the two 2008 R2 servers, which kicks it back to the Connection Broker, which in turn assigns the session to one of the two servers. In the event the user chooses a 2012 app, it gets routed to that server instead. All works as expected.
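For reference, one of the published RemoteApp .rdp files from this setup looks roughly like the sketch below (the server and app names here are placeholders, not my real ones):

    full address:s:rdsh01.mydomain.local
    remoteapplicationmode:i:1
    remoteapplicationprogram:s:||MyApp
    gatewayhostname:s:gateway.mydomain.com
    gatewayusagemethod:i:1

As I read it, that file points the client at the session host and the Gateway, not at the web server.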
Also included in this setup are two Gateway servers: one is 2012 and the other is 2008 R2. The 2008 R2 session hosts point to the 2008 R2 Gateway server, and the 2012 session host points to the 2012 one.
Here's what I don't understand. I rebooted the 2012 RDWEB server yesterday, and the people connected to the terminal apps on all three servers lost their connections until it came back up. I was assuming that the web site presented the user with apps which, when run, would hand the connection off to the Gateway server to complete the connection and manage its state. Clearly that's not what's going on here, because if it were, the connections would survive rebooting the web server. What am I missing here?
TIA
Jack