If you could buy an artificially intelligent robot, would you want it to have some measure of free will? Would you want it to be able to override your commands for your own good? If you had a collection of artificially intelligent army man robots, would you want them to have free will? If you owned an artificially intelligent painting, would you want it to have the ability to surprise you with a new look?
I'd need to address the moral question first of all. Should I be able to pay money for an intelligent being, and should I have the choice to subjugate its will? There are a lot of unknowns in this hypothetical. How, for instance, is artificial intelligence achieved? So often I've found myself with many things that I wanted to do but not enough time for them all. What if I woke up one day and found that I was a cloned intelligence uploaded to a computer and there was a real me out there? I think I might be more cool with it if I knew that my memories could be reintegrated back into meat space after I'd finished writing those letters, booking those appointments, paying those bills and sending those invoices.
I'm really intimidated by the idea of having an AI robot in my space. What if it decides it doesn't like me and tries to turn the other appliances against me in a coup?