Is it true that doctors lie about your diagnosis to make money?

Do doctors lie? Can you trust them, or are they generally honest because they can get in trouble for lying? For example, could a doctor give you a diagnosis for a disease you don't actually have just to make more money? Is that possible, or does it not usually happen?