Antenna elevation question

We are about to build a new tower and I'm a bit confused about the height calculation needed to reach our most remote customers, about 25 miles away.
The tower sits on flat desert terrain a few feet above sea level, and our customers are all on the same flat terrain. The formula I read takes the square root of the antenna height (in feet) multiplied by 1.415 to give the radio horizon in miles. It also shows an example of an antenna elevated 30 ft above ground reaching a radio horizon of 7.75 miles.
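Just to make sure I'm reading the formula right, here is a quick sketch of it (the function name is mine, and this assumes height in feet and distance in miles as in the example):

```python
import math

def radio_horizon_miles(height_ft):
    """Approximate distance to the radio horizon in miles
    for an antenna height given in feet: d = 1.415 * sqrt(h)."""
    return 1.415 * math.sqrt(height_ft)

print(radio_horizon_miles(30))  # ~7.75 miles, matching the example
```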
Does a linear approach apply here? Say I want to reach three times 7.75 miles, do I just need to elevate my antenna to three times 30 ft?

Please advise. Thanks

I got the answer to this question: a linear approach will not work on a rounded object (the earth). Because distance goes with the square root of height, tripling the range takes nine times the height, not three.
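In case it helps anyone else, inverting the same formula shows what the square-root relationship means for tower height (the function name is mine):

```python
import math

def required_height_ft(distance_miles):
    """Invert d = 1.415 * sqrt(h) to get the antenna height
    in feet needed for a given radio horizon in miles."""
    return (distance_miles / 1.415) ** 2

# Tripling the 7.75-mile range needs 9x the height, not 3x:
print(required_height_ft(3 * 7.75))  # ~270 ft, vs. the original 30 ft
```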
So I guess the only remaining question is: if I'm going from a lower elevation to a higher one, can I simply deduct the elevation difference from my antenna height?
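For what it's worth, the usual rule of thumb I've seen for two endpoints at different heights is to add the two individual radio horizons rather than subtracting one height from the other. A sketch under that assumption (heights measured above the same flat terrain; function name is mine):

```python
import math

def link_range_miles(tx_height_ft, rx_height_ft):
    """Total line-of-sight range as the sum of both radio horizons:
    d = 1.415 * (sqrt(h_tx) + sqrt(h_rx))."""
    return 1.415 * (math.sqrt(tx_height_ft) + math.sqrt(rx_height_ft))

print(link_range_miles(270, 0))  # one tall tower, receiver at ground level
print(link_range_miles(78, 78))  # two ~78 ft antennas span roughly 25 miles
```

One practical upshot: raising the far end even a little helps a lot, since each side contributes its own square-root term.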