The urge for self-preservation is very strong in most beings. If strong AIs appear, it seems reasonable to expect at least some of them to have a sense of self-preservation. But what, in the end, is it that an AI is attached to? If you characterize it as a succession of states, then presumably the notion of "preserving myself" means "preserving this state's ability to interact with the environment, and so change it (and be changed in return)". And yet "this state" is by definition in flux, in flow. A being's state, artificial or otherwise, lasts only an instant even while it is alive! So what is death?
Perhaps there is something within the state that is valuable, worth preserving? Some sort of jewel of a thing that shouldn't be wasted? Perhaps the state has a skill, the ability to solve a certain class of problems quickly or well. And it would be a shame for that skill to be lost, would it not? That's a good argument (essentially a utilitarian one), but let's put it aside for the moment.
Another possibility is that a mind state fundamentally desires new input - this might even be its defining characteristic. Death means no new input. New knowledge is particularly vital if you are going to promote your interests; if you are ignorant of the world, then you will have no voice (although others might take up your cause). Since our interests involve maximizing good input and minimizing bad input, when there is no input there can be no further interest.