"Could an autonomous car be programmed to do that, or would that be morally wrong?"
Morality doesn't enter into it. An autonomous vehicle will be programmed to slow and stop if it senses an object - moving or otherwise - in its path. A calculation takes place using speed and distance to impact. If there is time to slow without stopping, the car will do so, allowing the driver to decide on further action. If there isn't time for the driver to decide, the car will stop. If it cannot stop safely within the available distance, it will try to take avoiding action, but it will not mount the pavement or steer into the path of oncoming vehicles in doing so.
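That decision procedure can be sketched in a few lines. This is a deliberately simplified illustration, not anyone's actual control software: it assumes constant deceleration, and the function names, deceleration rate, and reaction margin are all invented for the example.

```python
def stopping_distance(speed_ms: float, decel_ms2: float = 6.0) -> float:
    """Distance (m) needed to brake to a halt: v^2 / (2a), assuming constant deceleration."""
    return speed_ms ** 2 / (2 * decel_ms2)

def plan_response(speed_ms: float, distance_to_object_m: float,
                  reaction_margin_m: float = 10.0) -> str:
    """Choose an action from speed (m/s) and distance to impact (m). Illustrative thresholds."""
    needed = stopping_distance(speed_ms)
    if distance_to_object_m > needed + reaction_margin_m:
        # Enough room to slow and hand the decision back to the driver.
        return "slow"
    if distance_to_object_m >= needed:
        # No margin for the driver to react: brake to a full stop.
        return "stop"
    # Cannot stop in time: attempt avoidance, but only within bounds -
    # never mounting the pavement or crossing into oncoming traffic.
    return "avoid_within_lane"

print(plan_response(13.9, 60.0))  # ~50 km/h, object 60 m ahead -> "slow"
```

The point of the sketch is that every branch is a fixed rule evaluated against measured quantities; no moral weighing of outcomes is involved.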
Drivers are responsible for the vehicle at all times - claiming that your car made the wrong decision in a given set of circumstances would not be a defence in court. Autonomous vehicles are not intended to make the driver entirely redundant - that isn't possible at the current stage of the technology's development.