I got the overall sense that the presentations were very much a work in progress. For example, I watched the in-depth Android Auto presentation, and they went into loads of detail on how to integrate with the music player, but nothing whatsoever on how you might do a full-screen app (say, a competing nav system). I got a similar sense from watching other presentations, where the uniform message was "here's all the cool and awesome stuff that we're doing. APIs aren't final. We're still working on it. Come on in and let us know how it works for you."
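To make the music-player model concrete: as presented, your app supplies content and playback behavior while the car's head unit owns the entire UI. Here's a minimal sketch of that split, with all names (`MediaCatalog`, `CarHost`, etc.) my own stand-ins rather than the actual (still non-final) Android Auto APIs:

```kotlin
// Hypothetical sketch of the Android Auto split: the app provides a
// browsable catalog plus playback callbacks; the host renders everything.
data class MediaItem(val id: String, val title: String)

// What a music app would implement: content only, no UI.
interface MediaCatalog {
    fun loadChildren(parentId: String): List<MediaItem>
    fun play(itemId: String)
}

// Stand-in for the host-side renderer that "skins" every app the same way.
class CarHost(private val catalog: MediaCatalog) {
    var nowPlaying: String? = null
        private set

    fun browseRoot(): List<MediaItem> = catalog.loadChildren("root")

    fun select(item: MediaItem) {
        catalog.play(item.id)
        nowPlaying = item.id
    }
}

fun main() {
    // A toy "music app" exposing one track under the root node.
    val app = object : MediaCatalog {
        private val tree = mapOf("root" to listOf(MediaItem("t1", "Track One")))
        override fun loadChildren(parentId: String) = tree[parentId].orEmpty()
        override fun play(itemId: String) = println("playing $itemId")
    }
    val host = CarHost(app)
    host.select(host.browseRoot().first())
    println(host.nowPlaying) // t1
}
```

The point of the design is that the host can enforce the driver-distraction rules uniformly, because no app ever draws its own screens.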

With respect to the questions about IR remotes and whatnot, there's clearly a broader universe in which they're hoping to put contextually relevant controls (e.g., Tony's roommate's desired pause button) everywhere you might want them. On your watch. On your phone. On the Android TV d-pad remote. Etc. Given that Google TV boxes also know how to be IR blasters for TiVos and such, there's clearly an attempt in the Android codebase to have all-singing-all-dancing-all-routing-to-the-right-destination remote control support. This suggests that an IR receiver connected to something akin to an Arduino board could probably do what you want. Or, who knows, maybe HDMI-CEC is part of the Android all-singing-all-dancing future.
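The routing idea boils down to: every input source (watch, phone, TV remote, an Arduino-attached IR receiver, HDMI-CEC) reduces to the same small command set, and one router decides which playback session receives it. A toy sketch, with all names my own invention:

```kotlin
// Hypothetical sketch of "all-routing-to-the-right-destination" remote
// support: heterogeneous sources feed one router, which forwards commands
// to whichever session currently holds focus.
enum class Command { PLAY, PAUSE, NEXT }

class RemoteRouter {
    private val delivered = mutableListOf<String>()

    // The one session that should receive commands right now
    // (e.g., the foreground music player).
    var activeSession: String = "none"

    fun dispatch(source: String, cmd: Command) {
        delivered.add("$source -> $activeSession: $cmd")
    }

    fun history(): List<String> = delivered
}

fun main() {
    val router = RemoteRouter()
    router.activeSession = "music-app"
    router.dispatch("watch", Command.PAUSE)      // the roommate's pause button
    router.dispatch("ir-receiver", Command.PLAY) // Arduino-style IR input
    router.history().forEach(::println)
}
```

Whatever the real implementation looks like, the takeaway is that the source of the button press is meant to be irrelevant by the time the command reaches an app.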

Meanwhile, back to the Android Auto stuff, they went on about how they've dealt with tons of legal bureaucracy in order to make sure that everybody's music player meets the appropriate requirements (e.g., the ability to operate the system while wearing glasses that black out your vision and give you only a 1.5-second window in which to execute a task before going black again).

So long as your music app is built with their APIs (wherein you're basically just skinning the official app), you're good to go. Beyond that, hello insane regulations. Curiouser and curiouser.