[R] Simulating mid-points from a defined range

Brian Smith
Sat May 31 20:52:08 CEST 2025


Hi,

Let's say I have a range [0, 100].

Now I need to simulate 1000 sets of 10 mid-points within the range,
accurate to two decimal places.

Let's say one simulated set is

X1, X2, ..., X10

Of course,

X1 < X2 < ... < X10

I have one more constraint: the difference between any two
consecutive mid-points must be at least 5.00.

I wonder whether there is any statistical theory available to support
this kind of simulation.
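
One framing that may help (my own sketch rather than a pointer to the
literature): the minimum-gap condition can be absorbed by a shift,
which reduces the problem to ordinary uniform order statistics on a
shorter interval. Writing

    Y_i = X_i - 5*(i - 1),    i = 1, ..., 10,

the constraints 0 <= X_1 < X_2 < ... < X_10 <= 100 and
X_(i+1) - X_i >= 5 become

    0 <= Y_1 <= Y_2 <= ... <= Y_10 <= 100 - 9*5 = 55,

so the Y_i can be taken as the order statistics of 10 iid
Uniform(0, 55) draws, and X_i = Y_i + 5*(i - 1) maps them back to a
valid set.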

Alternatively, is there any way to implement this in R?
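
A minimal sketch in R along those lines (my own code, not an
established routine), assuming the goal is 1000 sets of 10 points on a
0.01 grid in [0, 100] with consecutive gaps of at least 5.00:

set.seed(1)                       # for reproducibility; any seed will do

n_sets  <- 1000                   # number of simulated sets
n_pts   <- 10                     # points per set
min_gap <- 5                      # minimum gap between consecutive points
upper   <- 100                    # upper end of the range

## Removing the 9 mandatory gaps of 5 leaves an interval of width 55
slack <- upper - (n_pts - 1) * min_gap

## Work on a 0.01 grid so two-decimal accuracy never breaks the gap rule
grid_vals <- seq(0, slack, by = 0.01)

one_set <- function() {
  y <- sort(sample(grid_vals, n_pts, replace = TRUE))  # ordered shifted points
  y + min_gap * (seq_len(n_pts) - 1)                   # shift back: gaps >= 5
}

## 1000 x 10 matrix, one set per row; rounding is cosmetic since the
## values already lie on the 0.01 grid
sims <- round(t(replicate(n_sets, one_set())), 2)

## Sanity check (with a small tolerance for floating-point noise)
all(apply(sims, 1, function(x) all(diff(x) >= min_gap - 1e-9)))

Sampling with replace = TRUE allows two shifted points to coincide,
i.e. a gap of exactly 5.00; use replace = FALSE if every gap should be
strictly greater than 5.00.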


