Indeed, this is exactly what employer-installed root certificates are typically used for: to facilitate man-in-the-middle inspection of encrypted traffic (usually in order to enforce network usage policies)! The root certificate that’s installed and marked as trusted is one with “Certificate Authority” purposes: i.e. it’s trusted for the purpose of signing other certificates, so the man-in-the-middle tool can generate certificates as required to intercept anything it likes.

There are a number of countermeasures to this, yes. Probably the most effective is for the app to require that the certificate (or, more likely, the signing certificate further up the chain) has a particular fingerprint: the fingerprint is a cryptographic hash of the certificate, so an interceptor can neither forge a different certificate with a matching fingerprint nor present the genuine one without holding its private key. Pinning the leaf certificate itself is the most brittle approach, because every certificate change or addition must be accompanied by an update to the app, but contemporary app store mechanics mean that this isn’t infeasible.

Demanding that the root certificate is one of a pre-validated set is exactly what most browsers these days do for Extended Validation (EV) certificates, by the way: when a web browser (other than IE) shows you a “green address bar”, you can be sure that the certificate was signed by one of the standard “trusted” root certificates included by that browser vendor: your employer can’t fake this unless they supply their own custom build of your browser (or have the support of a CA). However, that doesn’t help much if the user doesn’t know to look for an EV certificate on a particular site in the first place, because there’s no reason an intercepting proxy couldn’t simply substitute a non-EV certificate when it intercepts.
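To make the pinning approach concrete, here’s a minimal sketch of what it can look like in a mobile app, using OkHttp’s CertificatePinner on Android/the JVM. The hostname and the sha256/… value are placeholders for illustration (and note that OkHttp pins a hash of the certificate’s public key rather than the whole-certificate fingerprint, but the effect is the same):

```kotlin
import okhttp3.CertificatePinner
import okhttp3.OkHttpClient
import okhttp3.Request

// Placeholder host and pin: the sha256/… value is the base64-encoded SHA-256
// of the pinned certificate's public key (SPKI), obtained out-of-band.
val client = OkHttpClient.Builder()
    .certificatePinner(
        CertificatePinner.Builder()
            .add("api.example.com", "sha256/AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA=")
            .build()
    )
    .build()

// Any TLS handshake whose chain doesn't contain a matching key is rejected
// with an SSLPeerUnverifiedException, even if a locally-installed root signed it.
val response = client.newCall(Request.Builder().url("https://api.example.com/").build()).execute()
```

Pinning the signing certificate’s key rather than the leaf’s gives you the less-brittle variant described above: the server’s certificate can be renewed freely so long as the same authority signs it.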

Yes, an application could introduce another level of encryption on top, but that would mostly be an example of “security through obscurity”, because an intercepting proxy targeting that specific application could (probably) still break in. If it uses asymmetric encryption of any kind then the same principle applies: if the fingerprint of the certificate (or, more likely, the signing certificate) isn’t verified, it can be spoofed by a proxy. Symmetric encryption doesn’t have this problem, but introduces a different one in its place: the cryptographic keys must be stored within the app itself and so can be extracted by reverse-engineering. This might provide a barrier to casual interception, but won’t prevent a determined attacker.
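To illustrate the symmetric case, here’s a hedged sketch (not any real app’s scheme; the key and function name are invented) of the kind of extra encryption layer described. The problem is visible right in the source: the key is a constant, so anyone who decompiles the app recovers it.

```kotlin
import java.security.SecureRandom
import javax.crypto.Cipher
import javax.crypto.spec.GCMParameterSpec
import javax.crypto.spec.SecretKeySpec

// Hypothetical hard-coded key: anyone who decompiles the app can read this
// constant straight out of the binary, defeating the extra layer entirely.
private val EMBEDDED_KEY = SecretKeySpec(ByteArray(16) { it.toByte() }, "AES")

fun encryptForServer(plaintext: ByteArray): ByteArray {
    val iv = ByteArray(12).also { SecureRandom().nextBytes(it) }
    val cipher = Cipher.getInstance("AES/GCM/NoPadding")
    cipher.init(Cipher.ENCRYPT_MODE, EMBEDDED_KEY, GCMParameterSpec(128, iv))
    return iv + cipher.doFinal(plaintext) // prepend the IV so the server can decrypt
}
```

An intercepting proxy’s operator only needs to extract that key once to be able to read every user’s traffic thereafter.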

The correct approach, then, is fingerprint comparison – the same as browsers do (for EV certificates, at least). I’ve tried a dozen or so apps on my phone, though, and virtually none of them do this, so you should consider your mobile apps to be as vulnerable as your web browsing is if you install a third-party CA certificate onto your device! If you’ve been asked by your employer to add a certificate to your trusted store, it’s worth checking whether it’s one that’s permitted only to verify the identity of a particular domain or set of domains (which is probably okay) or whether it’s one that’s capable of signing other, arbitrary certificates (which is pretty alarming!).
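If you’d rather perform that check programmatically than squint at a certificate-details dialog, the signing capability lives in the certificate’s Basic Constraints extension. A minimal sketch, assuming you’ve saved the certificate you were given to a file:

```kotlin
import java.io.FileInputStream
import java.security.cert.CertificateFactory
import java.security.cert.X509Certificate

fun main(args: Array<String>) {
    // Load a DER- or PEM-encoded certificate from the path given on the command line.
    val cert = FileInputStream(args[0]).use {
        CertificateFactory.getInstance("X.509").generateCertificate(it) as X509Certificate
    }
    // getBasicConstraints() returns -1 for a non-CA certificate; otherwise it's the
    // maximum number of intermediate CAs permitted beneath it in a chain.
    val pathLen = cert.basicConstraints
    if (pathLen == -1) {
        println("Not a CA certificate: identifies only its own subject (probably okay).")
    } else {
        println("CA certificate (path length $pathLen): can sign arbitrary certificates (alarming!).")
    }
}
```

(The same information appears in the output of keytool -printcert, under the BasicConstraints extension, if you’d rather not write any code at all.)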

(There are, of course, user-driven countermeasures too, like use of a “known good” VPN or SSH tunnel, but your question was about what app developers can do.)