I haven't bought a car for several years but have recently returned to the market. It now seems to be standard practice that before taking a car out for a test drive you have to sign an insurance agreement indemnifying the dealer for the first few thousand pounds of any damage caused to the car.
When did this become a thing? So far I have declined to sign these documents, and in every case the salesman has allowed me to take the car anyway!
This has happened across many different dealerships, so it is not specific to one particular make.
Is this situation normal?