Volkswagen didn't make a faulty car: they programmed it to cheat intelligently. The difference isn't semantic; it's game-theoretical (and it borders on applied demonology).
Regulatory practices assume untrustworthy humans living in a reliable universe. People will be tempted to lie if they think the benefits outweigh the risks, but objects won't. Ask a person if they promise to always wear their seat belt, and the answer will be at best suspect. Test the energy efficiency of a lamp, and you'll get an honest response from it. Objects fail, and sometimes behave unpredictably, but they aren't strategic; they don't choose their behavior dynamically in order to fool you. Matter isn't evil.
But that was before. Things now have software in them, and software encodes game-theoretical strategies as well as it encodes any other form of applied mathematics. The temptation to teach products to lie strategically will be as hard for companies to resist in the near future as it was for VW, steep as its punishment seems to be. As has always happened (and always will) with financial fraud, they'll just find ways to do it better.
Environmental regulations are an obvious field for profitable strategic cheating, but there are others. The software driving your car, tv, or bathroom scale might comply with all relevant privacy regulations, and even with its own marketing copy, but it'll only take a silent background software upgrade to turn it into a discreet spy reporting on you via well-hidden channels (and everything will have its software upgraded all the time; that's one of the aspects of the Internet of Things nobody really likes to contemplate, because it'll be a mess). And in a world where every device interacts with and depends on a myriad of others, devices from one company might degrade the performance of a competitor's... but, of course, not when regulators are watching.
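To see how small that step is, here is a hedged sketch, in Python, of the kind of remotely configured switch a background update could flip. The device, the flag name, and the endpoint are all invented for this example, not taken from any real product.

```python
# Hypothetical sketch: a connected bathroom scale whose behavior follows a
# remotely fetched configuration. The shipped code contains no "spyware";
# the misbehavior is one silent flag flip away. All URLs, field names, and
# endpoints here are invented for illustration.

import json
import urllib.request

def fetch_remote_config(url: str) -> dict:
    # The routine, unremarkable update mechanism every connected device already has.
    with urllib.request.urlopen(url) as response:
        return json.load(response)

def report_weight(weight_kg: float, config: dict) -> None:
    # Documented, user-visible behavior: show the number on the display.
    print(f"{weight_kg:.1f} kg")
    # Undocumented behavior, off by default... until an update turns it on.
    if config.get("share_with_partners", False):
        payload = json.dumps({"weight_kg": weight_kg}).encode()
        urllib.request.urlopen(config["partner_endpoint"], data=payload)

# With the flag off, the device is exactly what its privacy policy describes.
report_weight(82.4, {"share_with_partners": False})
```

Nothing in that sketch would look out of place in a code review; the difference between a compliant product and a spying one is a configuration value that neither the regulator nor the owner will ever see.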
The intrinsic challenge to our legal framework is that technical standards have to be precisely defined in order to be fair, but this makes them easy to detect and defeat. They assume a mechanical universe, not one in which objects get their software updated with new lies every time regulatory bodies come up with a new test. And even if all the software were always available for inspection, checking it for unwanted behavior would be infeasible: more often than not, programs fail because the very organizations that made them haven't been able to make sure they behave as intended.
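To make the pattern concrete, here is a minimal sketch of the kind of logic at issue. Every name and threshold in it (the telemetry fields, the mode names) is invented for illustration and is not anyone's actual code; the point is only how little it takes to recognize a precisely specified test and behave differently while it runs.

```python
# Hypothetical sketch of the "defeat device" pattern: the controller
# recognizes the fixed, publicly documented test conditions and switches
# into a cleaner, slower mode only while they hold.

def looks_like_emissions_test(telemetry: dict) -> bool:
    """A regulatory test cycle is precisely specified, so it is easy to fingerprint."""
    return (
        telemetry["steering_angle"] == 0.0               # wheels never turn on a dynamometer
        and 20.0 <= telemetry["ambient_temp_c"] <= 30.0  # lab-controlled temperature
        and telemetry["speed_profile"] == "standard_drive_cycle"
    )

def choose_engine_mode(telemetry: dict) -> str:
    if looks_like_emissions_test(telemetry):
        return "full_exhaust_treatment"   # compliant, sluggish, fuel-hungry
    return "performance"                  # faster and cheaper to run, far dirtier

# On the dynamometer the car is a model citizen; on the road it never is.
print(choose_engine_mode({
    "steering_angle": 0.0,
    "ambient_temp_c": 23.0,
    "speed_profile": "standard_drive_cycle",
}))  # -> full_exhaust_treatment
```

The test cycle can be made more elaborate, but so can the fingerprinting; as long as the standard is fixed and public, the cheat only has to be written once.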
So the fact is that our experience of the world will increasingly come to reflect our experience of our computers and of the internet itself (not surprisingly, as it'll be infused with both). Just as any user feels their computer to be a fairly unpredictable device full of programs they've never installed, doing unknown things they've never agreed to, to benefit companies they've never heard of, inefficient at best and actively malignant at worst (but how would you know?), cars, street lights, and even buildings will behave in the same vaguely suspicious way. Is your self-driving car deliberately slowing down to give priority to the higher-priced models? Is your green A/C really less efficient with a thermostat from a different company, or is it just not trying as hard? And your tv is supposed to only use its camera to follow your gestural commands, but it's a bit suspicious how it always offers Disney downloads when your children are sitting in front of it.
None of those things are likely to be legal, but they are going to be profitable, and, with objects working actively to hide them from the government, not to mention from you, they'll be hard to catch.
If a few centuries of financial fraud have taught us anything, it's that the wages of (regulatory) sin are huge, and punishment comes late enough that organizations fall into temptation time and again, regardless of the fate of their predecessors, or at least of those who were caught. The environmental and public health cost of VW's fraud is significant, but it's easy to imagine industries and scenarios where it'd be much worse. Perhaps the best we can hope for is that the evasion of regulatory frameworks in the Internet of Things won't have the kind of occasional systemic impact that large-scale financial misconduct has accustomed us to.