
Do car dealerships have to give you a warranty?

No, car dealerships are generally not required by law to provide a warranty on used vehicles; many are sold "as is." However, many dealerships do offer warranties as an added benefit to customers. The terms and conditions of these warranties vary from dealership to dealership, so it is important to read the warranty carefully before purchasing a vehicle.