I guess that's true, but in practice XSS is a much bigger threat to your application than someone MITM'ing an HTTPS connection to Google's (or whomever's) CDN.
If you don't allow external scripts to be modified, why host them externally at all? Why not just wget them and host them locally alongside the checksum document and skip all this silliness?
Oh, also, those scripts can themselves load in other scripts you haven't checksummed.
Because you might be using a CDN for performance and bandwidth benefits.
If you're relying on a third-party piece of code that you allow to change at any time, it is very difficult to do any release testing that gives you a known set of conditions your application should work under.
In what scenario do you want external scripts to be modified? Why not take advantage of their ability to serve the scripts while also verifying that they are the same scripts you expected to have? You can also verify that those scripts do not load any other scripts in the version you have. Then, if it's changed later to load more scripts, you'll know about it.
How is checking the validity of the scripts that run on your site madness??
It's... kinda madness. Just to be clear that we're talking about the same thing, here's the proposed process as I understand it:
1) load your loader script, which has the URLs and fingerprints of the scripts you want to run, plus the necessary dependency information (jquery-ui must load after jquery, for example). In the best case, this file is being served by the same server that's hosting the HTML, so that at least you're not adding more attack vectors.
2) from the loader script, initiate ajax requests for each of the remote files you need
3) as you get each one back, validate that its signature matches the expected fingerprint, raise an exception if it does not (ideally also displaying something to the user), and evaluate it once its signature matches and all of its dependencies have loaded.
So, why is this madness?
1) Most of the time the reason that you're letting a third party host these files is for speed. They've got a CDN, and hopefully the file will already be cached by your user. Grabbing resources with javascript that you could load directly in the html will slow down your page's loading time, as the browser's html parser isn't able to look ahead and fetch resources that are likely to be needed before the renderer has asked. (HTML has a defined rendering order that can be kinda strict sometimes; this is the same reason why you don't put your <script> elements in the <head>.)
2) Another reason for using a CDN for your JS libraries is convenience, which this process also wipes out.
3) The whole thing won't work at all unless the third party server sends back cooperative CORS headers, as you can't do an ajax request to a third party site without their cooperation.
Finally though – and this is the big one – it's more convenient for the developers, strictly safer, and faster for the end user if you just compile all of the JS and serve from the same domain that's serving your HTML. As stated above, if that server is compromised, you're toast anyways (barring a browser extension or similar). If you really want some more security, look into SSL (and actually look into it, there are definitely much better and much worse ways of doing it).
Most of what you're describing as "madness" is already done in head.js and require. They have no particular speed penalties and handle dependencies better than just putting script tags in the right order. The one difference is that a system like this would check the hashes to verify the code.
The one possible catch, as you mention, would be getting access to these scripts before they are loaded without having cross-origin problems.
There are a number of problems with serving from your own domain. It is, in fact, much less convenient for developers, as it adds an extra step to the build process and requires the system to properly handle caching so that old resources are not still served after a build. It is also slower to serve from the same domain as there are connection limits. Lastly, it gives up all advantages of a CDN.
My proposal is an attempt to continue taking advantage of CDNs and third party resources, but without giving them the keys to your site. Did you ever consider that Google could access all of your users' cookies if they added a small modification to jQuery or Analytics? Considering recent revelations about government involvement, is it really out of the question that they would take that information?
head.js and require do have significant speed penalties unless you're just using them for tracking dependencies and for developing locally. It may be the right tradeoff of effort vs. performance for some projects to leave this enabled even in production, but there's nothing to gain in denying the huge performance boost you're leaving on the table by not compiling your js.
I'll try to extract the core of my argument. The hashing proposal is madness for two reasons: 1) it's slower and less secure than just serving all of your js in one file; 2) it will not actually work without the cooperation of a CDN.
1) The proposal requires you to have one trusted server that you're serving javascript resources out of (because you need to load the script loader and fingerprints from there). If you want fast and secure, you've already paid the cost of a round trip to server #1, and the risk of trusting server #1. The sane thing to do from a performance and security standpoint is to load all of the javascript that you can in that request. Otherwise you're going to be blocking on that request returning, then the renderer reaching that script's location in the html, then that script being executed before it fires off the requests.
2) I'll phrase this as a challenge. Try to load jquery from a CDN with an ajax request. Remember, the key is to get the source of the script into memory without executing it, so that you can hash and validate it first. Feel free to try it right now in your developer console, I'll even give you a code snippet to start from:
var url = '//ajax.googleapis.com/ajax/libs/jquery/1.10.2/jquery.min.js';
var request = new XMLHttpRequest();
request.open('GET', url);
request.onload = function () {
  // without cooperative CORS headers, the browser blocks access to the source
  console.log(request.responseText.length);
};
request.send();