The way elderly care is delivered is changing. Care robots are being introduced to accommodate the growing number of elderly people and the declining number of caregivers available to look after them. This change introduces ethical issues into robotics and healthcare. The two-part study (heuristic evaluation and survey) reported here examines a phenomenon that results from that change. The phenomenon arises from a contradiction. Of the 102 survey respondents, 12 were elderly. All but 2 of those 12 (who were undecided) wanted to be able to change how the presented care robot made decisions, and 7 of the 12 wanted to be able to examine its decision-making process, so as to ensure that the care provided is personalized. At the same time, however, 34% of the elderly participants said they were willing to trust the care robot inherently, compared with only 16% of the participants under fifty. Additionally, 66% of the elderly respondents said they were very likely or likely to accept and use such a care robot in their everyday lives. This combination of inherent trust and simultaneous wariness about control gives rise to the phenomenon: elderly people in need want control over their care to ensure that it is personalized, yet many may be desperate enough to accept any help they can get. The possible causes and ethical implications of this phenomenon are the focus of this paper.