SPAs are dead!?

Clickbait, isn’t it? But this was Brock’s immediate reaction when we saw this post (and I recommend you read it first):

https://webkit.org/blog/10218/full-third-party-cookie-blocking-and-more/

What this basically means is that browsers are getting more and more strict with how they handle their cookies. The reasons are security (see the recent SameSite changes) and, in this case, privacy. Cookies have been exploited in one way or another for a long time – and this is now the reaction of the browser vendors.

What does that mean for application architectures? Well – first of all – all these changes only affect cross-site scenarios. But if you are in that situation, the immediate consequences will be:

  • front-channel logout notifications do not work anymore (used in pretty much every authentication protocol – like SAML, WS-Fed and OpenID Connect)
  • the OpenID Connect JavaScript session notifications don’t work anymore
  • the “silent renew” technique that was recommended so far to give your application session-bound token refreshing doesn’t work anymore (sketched below)
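
To make that last point concrete, here is roughly what “silent renew” looks like under the hood – a hand-rolled sketch only, with placeholder endpoint and client values (libraries such as oidc-client implement this properly): a hidden iframe navigates to the authorize endpoint with `prompt=none` and relies on the identity provider’s session cookie being sent from that cross-site iframe – exactly the cookie that is now blocked.

```typescript
// Minimal sketch of "silent renew". A hidden iframe calls the authorize endpoint
// with prompt=none and relies on the IdP's session cookie being sent from the
// (cross-site) iframe -- the very cookie Safari/Brave now block.
// All endpoint/client values are placeholders.
function silentRenew(authorizeEndpoint: string, clientId: string, redirectUri: string): Promise<string> {
  return new Promise((resolve, reject) => {
    const iframe = document.createElement("iframe");
    iframe.style.display = "none";

    const params = new URLSearchParams({
      client_id: clientId,
      redirect_uri: redirectUri, // a silent-callback page on the SPA's own origin
      response_type: "code",
      scope: "openid api",
      prompt: "none",            // fail instead of showing login UI
    });
    iframe.src = `${authorizeEndpoint}?${params}`;

    // The silent-callback page posts the authorize response back to the opener.
    const onMessage = (e: MessageEvent) => {
      if (e.origin !== window.location.origin) return;
      window.removeEventListener("message", onMessage);
      iframe.remove();
      // With third-party cookies blocked, the IdP never sees its session cookie
      // and answers with error=login_required instead of a code.
      e.data.error ? reject(new Error(e.data.error)) : resolve(e.data.code);
    };
    window.addEventListener("message", onMessage);
    document.body.appendChild(iframe);
  });
}
```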

Safari and Brave are the first browsers implementing those changes. Chrome will follow in 2022 (hopefully sooner), etc.

Some things can be fixed, e.g. you can replace front-channel notifications with back-channel ones. Some people recommend replacing silent renew with refresh tokens. This is dangerous advice – even if your token service has implemented countermeasures.
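
For illustration: a back-channel logout receiver is just an endpoint on your back end that the token service POSTs a signed `logout_token` to – no browser cookies involved. A rough sketch in Node/TypeScript with Express and the `jose` package; the issuer, client id, JWKS URL and the session-store helper are assumptions, not prescriptions:

```typescript
import express from "express";
import { createRemoteJWKSet, jwtVerify } from "jose";

// Placeholder values -- adjust to your token service and client registration.
const ISSUER = "https://idp.example.com";
const CLIENT_ID = "spa-bff";
const JWKS = createRemoteJWKSet(new URL(`${ISSUER}/.well-known/openid-configuration/jwks`));

const app = express();

// The token service POSTs `logout_token=<jwt>` (form-encoded) to this registered endpoint.
app.post("/backchannel-logout", express.urlencoded({ extended: false }), async (req, res) => {
  try {
    const { payload } = await jwtVerify(req.body.logout_token, JWKS, {
      issuer: ISSUER,
      audience: CLIENT_ID,
    });

    // Per the back-channel logout spec: the events claim must contain the
    // logout event, and a nonce must not be present.
    const events = payload.events as Record<string, unknown> | undefined;
    if (!events?.["http://schemas.openid.net/event/backchannel-logout"] || payload.nonce) {
      return res.status(400).send("invalid logout token");
    }

    // Kill the server-side session(s) for this subject / session id
    // (hypothetical helper -- depends on your session store).
    await revokeSessions(payload.sub, payload.sid as string | undefined);
    res.status(200).end();
  } catch {
    res.status(400).send("invalid logout token");
  }
});

async function revokeSessions(sub?: string, sid?: string): Promise<void> {
  // placeholder -- remove the matching sessions from your store
}

app.listen(3000);
```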

If all the security concerns around storing tokens in browsers did not worry you so far, maybe this is a good reason to consider the BFF architecture for your SPAs (if you are looking for a less fancy name – it’s what the BCP calls “JavaScript Applications with a back-end”).
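
To make the BFF idea concrete, here is a heavily reduced sketch: the SPA only ever talks to its own back end using a first-party, HttpOnly session cookie; the tokens stay on the server, and the back end attaches the access token when it forwards API calls. Login, token refresh, CSRF protection and the session store are omitted; all names and URLs are illustrative.

```typescript
import express from "express";

const app = express();

// Placeholder: look up the server-side session (and its tokens) by the
// HttpOnly session cookie -- the details depend on your session store.
function getSession(req: express.Request): { accessToken: string } | undefined {
  // e.g. keyed by req.headers.cookie
  return undefined;
}

// The SPA calls /api/... on its own origin; the BFF forwards the call to the
// real API and attaches the access token server-side. Tokens never reach
// browser JavaScript. (Request-body forwarding omitted for brevity; fetch is
// global in Node 18+.)
app.use("/api", async (req, res) => {
  const session = getSession(req);
  if (!session) return res.status(401).end();

  const upstream = await fetch(`https://api.example.com${req.path}`, {
    method: req.method,
    headers: { Authorization: `Bearer ${session.accessToken}` },
  });
  res.status(upstream.status).send(await upstream.text());
});

app.listen(3000);
```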

So are SPAs dead? Well – SPAs as in the UI/UX concept, certainly not. SPAs as in “browser-based standalone applications that do cross-site authentication and API calls in the context of modern identity and SSO architectures” – yes.

(and they should be unless we fix some fundamental problems)

my 2c.


27 Responses to SPAs are dead!?

  1. Simon says:

    This is also going to affect the form_post response mode in the code flow, right? Auth middlewares like the .NET ones add a nonce and a correlation cookie that would be considered third-party by the browser when an identity server does the form post. I suppose we default to query now?

    • I would prefer query anyway because of the better browser navigation experience (back button). But also – the third-party cookie blocking only affects non-top-level windows (e.g. iframes).

  2. James Hancock says:

    I really, really, really wish that browsers would step up with a standard for authentication that would allow you to call an authenticate function from JavaScript and pass in the OpenID Connect metadata URL. This would then pop up a login window, modal to the tab, use that metadata to log in, and create a full client as if you were using a native client and not a web browser. The browser would then do refresh tokens and storage of those tokens going forward for you, and you’d have the ability to query the auth and identity information using JavaScript, instead of handling the whole thing yourself.

    By doing this, it would solve the entire thing, be vastly more secure, and provide users with an excellent experience.

  3. andsav says:

    Seems like good news, since direct authentication (without a back-end) against third-party services from a non-secure client is obviously a weak approach in many cases.

  4. It’s not about the authentication – but about the session and token management.

  5. Steven says:

    How about using the BFF architecture with server-side auth and refresh tokens, but having that backend send a short-lived access token to the frontend? Refresh could be handled through an API endpoint on the BFF.

    Yes, there is still a risk of access tokens being stolen through XSS. But this risk would be the same as we have now with the current silent refresh technique.

    • But that would mean that the client has to implement some sort of token lifetime management again?

      • James Hancock says:

        No. The SPA client would set it up by passing the OpenID config URL at startup; the browser would then link it to the URL in question and handle refresh tokens securely, like native apps do, as necessary to keep the user logged in – and if that is no longer possible, it would report back with an event that the app isn’t logged in anymore. JavaScript could immediately poll for the login info, and if a refresh is in progress it could wait and subscribe for the refresh to complete before loading, so there is no constant flashing.

        What I’m proposing is that the browser sets up, per IdP, a native secure client with OpenID Connect best practices automatically, based on that single config URL, and then JavaScript and events do the rest: they pass back simple status changes and allow getting the user info. Any site that passes that IdP metadata URL shares that login, and the resource permission prompt shown by the IdP in the pop-up browser window the first time controls which apps get access – just like you get in real native apps today.

        This completely eliminates the refresh token insecurity issue on the web, provides a best-practices implementation as a native client, and eliminates all of the weak OpenID Connect methods that exist just for browsers.

        What’s more, it eliminates the third-party cookie problem and would fix the Microsoft/Google/world+dog authentication issues with endless re-prompts for nothing – Edge, as an example, could integrate with the profile that’s signed in and just work after the initial prompt to allow access by the site. Even better, if this was defined in the meta tags of the root page, the browser could automatically open the site in question with the right profile. (I.e. if you have two Office 365 accounts, they can be tagged to point to the right account in the config setup and use the right profile and the right window, so we don’t have to manually move to a new browser or window to get to the right location.)

        It’s the Holy Grail if you think about it: your app doesn’t have to redirect to an auth server and then reload, it solves the refresh token problems, and it fixes all of the multi-site issues of Office/Azure, AWS/Amazon, or Google – plus page-local profile selection – all in one shot.

      • Steven says:

        Yes, but the advantage would be that you don’t need to proxy all API requests anymore. Even with the proxy approach, you still have some of the same issues around session lifetime management.

        XSS would still be a very big risk, but isn’t that partially mitigated by short-lived tokens?

      • XSS is always a risk when you are running in the browser. But token exfiltration would allow the attacker to make arbitrary API calls as opposed to only via a defined interface.

        Token lifetime management on the server in ASP.NET Core is pretty much a solved problem – check my BFF sample.

      • James Hancock says:

        You would completely eliminate all XSS issues on the login page with my approach, and eliminate 3rd-party cookie redirects while you’re at it.

        The window that pops up could be hardened to prevent absolutely everything but a local form post to the same URL as the login page, and to block all external scripting from other browser windows entirely. And because it is a native client, it uses refresh tokens, not iframe refreshes, and thus has none of those issues either.

        Just allow password managers to still work, with special hooks.

        And no lifetime management at all. The browser does the work. You just subscribe to the events it raises.

      • You mean – if you could re-invent the web/browsers. Yes absolutely.

      • James Hancock says:

        If it could be made a standard. Which, by the way, is in the interest of the browser makers (at least Google and Microsoft), both of which have a cluster f*ck going on with SSO right now.

  6. I touched on this in another comment on this site, but doesn’t the fact that we won’t be able to do silent renew mean that browser-based applications (even those using BFFs) will always need to ask for refresh tokens, even if we only want to access the RS while the user is present?

    • Yes. That’s true.

      • That seems a bit awkward, as it has the potential to desensitise users to the impact of granting offline access if they always have to do it. It also begs the question of whether there’s any point in asking for the permission if it’s always required. I know that the OIDC spec mandates it, but I feel like the spec was likely written under the assumption that it wouldn’t always be required. Perhaps requiring offline access will put greater emphasis on the ability to revoke permissions, but I think that only works if the main application exposes this functionality from the AS, and the user needs to have an awareness of the functionality.

        I guess it’s still possible to achieve a similar outcome to silent renew by creating new windows/tabs with `prompt=none`, especially by using pop-under windows, but having windows flash open and close every couple of minutes seems like it could be very annoying and a no-go for most applications. Like others have said, browser-based handling of tokens/sessions is the ideal solution, but silent renew is what we have now, and I think that always requiring refresh tokens/offline access for client applications is a bad practice for OAuth 2.

  7. Hi Dominick,

    Been researching this issue due to a Single Sign-Out requirement, and it seems that Single Sign-Out as per the OpenID Connect spec is collateral damage from the welcome privacy improvement provided by blocking 3rd-party cookies? Is this your understanding, and do you know of any official articles from Microsoft / IdentityServer on the impact and mitigation of this change when using their product and implementing single sign-out?

  8. James Hancock says:

    One has to believe that the better solution to this is for the browser manufacturers to put in full native-client OpenID Connect support, per website, with a JavaScript kickoff. It would pop up a window, use full native authentication with client id and client secret, and do refresh tokens as well, based on a login() with a discoveryUrl and clientId (if already logged in, it would return immediately). Then it would have a notification event that fires to get the latest auth token, or it could be set to intercept fetch by default and add the bearer token securely. Everything would be stored encrypted within the built-in client.

    New authentication methods could be added as needed, but the mechanism would stay the same.

    This, to me, is the only real solution to the problem at hand. Everything else is a hack. And this solution ensures that we don’t have SPAs reloading after login, too.
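
    (For what it’s worth, here is the shape of the API described above, rendered as TypeScript. This is purely hypothetical – no browser implements anything like it today – and every name in it is made up for illustration.)

    ```typescript
    // Purely hypothetical browser API -- a TypeScript rendering of the proposal above.
    interface BrowserOidcSession {
      readonly isLoggedIn: boolean;
      // The browser returns a current token, refreshing internally as needed.
      getAccessToken(): Promise<string>;
      addEventListener(type: "statuschange", cb: (loggedIn: boolean) => void): void;
    }

    interface NavigatorOidc {
      // The browser acts as a native OIDC client: discovery, refresh tokens and
      // encrypted storage handled internally; login UI shown in a trusted window.
      login(options: { discoveryUrl: string; clientId: string; scope?: string }): Promise<BrowserOidcSession>;
    }

    // Imagined usage from a SPA:
    // const session = await (navigator as unknown as { oidc: NavigatorOidc }).oidc.login({
    //   discoveryUrl: "https://idp.example.com/.well-known/openid-configuration",
    //   clientId: "my-spa",
    // });
    // session.addEventListener("statuschange", (loggedIn) => { /* update the UI */ });
    ```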

  9. John says:

    We have fully deployed an application ecosystem using IdentityServer4 to provide single sign-on between two SPAs and a mobile app, and also to secure the API. Now we are being tasked by the client with fixing the 3rd-party cookie blocking issue. Many of the QA testers are using Macs with Safari and thus have 3rd-party cookies blocked by default. We are using sub-domains for all of the different apps… api.mydomain.com, sso.mydomain.com, spa1.mydomain.com, etc. Is implementing a reverse proxy the least painful way to address this issue right now? So essentially api.mydomain.com becomes mydomain.com/api and so forth.
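
    (One possible shape of such a reverse proxy, as a Node/TypeScript sketch using the http-proxy-middleware package and assuming its v2-style options – the host names are taken from the question, everything else is illustrative:)

    ```typescript
    import express from "express";
    import { createProxyMiddleware } from "http-proxy-middleware";

    const app = express();

    // mydomain.com/api/*  ->  api.mydomain.com/*   (and the same for the IdP)
    app.use("/api", createProxyMiddleware({
      target: "https://api.mydomain.com",
      changeOrigin: true,
      pathRewrite: { "^/api": "" },
    }));
    app.use("/sso", createProxyMiddleware({
      target: "https://sso.mydomain.com",
      changeOrigin: true,
      pathRewrite: { "^/sso": "" },
    }));

    // The SPA itself is served from mydomain.com, so all cookies become first-party.
    // TLS termination would sit in front of this process.
    app.listen(3000);
    ```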

  10. Sean says:

    Starting a greenfield project, thinking of doing a mini-SPA setup with an ASP.NET Core BFF and IdentityServer. Would it be totally insecure to utilize refresh tokens in the back-end and then update the SPA with new auth tokens via a secure websocket? We’re still very new to IdentityServer and OpenID Connect in general – are there any recommended approaches by chance for such a setup? Any feedback/advice is much appreciated. Thanks.

    • That sounds like storing tokens in the browser. That’s what you want to avoid.

      • danutz-plusplus says:

        Is storing the tokens in the browser, but within cookies flagged as Secure and HttpOnly, still considered “in the browser”? Otherwise, if you only store them in the backend I assume you’d have to retrieve them from storage if you want to scale horizontally? Is this an acceptable practice?

      • Since that is not accessible to JavaScript, and the cookie contents should be encrypted – this counts as “server-side”.
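
        (For illustration, the cookie flags in question look roughly like this in a Node/Express back end – the encryption helper and the payload are stand-ins, not a real implementation:)

        ```typescript
        import express from "express";

        const app = express();

        // Stand-in: a real implementation encrypts and MACs the value (data protection).
        function encryptForCookie(value: string): string {
          return value; // placeholder
        }

        app.post("/signin-callback", (req, res) => {
          const sessionPayload = encryptForCookie(JSON.stringify({ sub: "123", sid: "abc" }));

          // HttpOnly + Secure + SameSite: not readable from JavaScript and only sent
          // first-party over HTTPS -- this is what "counts as server-side" above.
          res.cookie("__Host-session", sessionPayload, {
            httpOnly: true,
            secure: true,
            sameSite: "strict",
            path: "/",
          });
          res.redirect("/");
        });

        app.listen(3000);
        ```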
