The situation....
Consider a baseband bus with a number of equally spaced stations, a data rate of 10 Mbps, and a bus length of 1 km.
The first part of your query...
1) What is the average time to send a frame of 1000 bits to another station, measured from the beginning of transmission to the end of reception? Assume a propagation speed of 200 m/µs.
The answer...
Since time (T) = distance/speed, we first need the mean distance between stations.
A station in the middle of the bus is, on average, 0.25 km from any other station on either side of it. A station at the end of the bus is, on average, 0.5 km from any other station. The average distance between stations can therefore be estimated as (0.5 + 0.25)/2 = 0.375 km.
T = transmission time + propagation time = (10^3 bits / 10^7 bps) + (375 m / (200 x 10^6 m/s)) = 100 µs + 1.875 µs ≈ 102 microseconds (0.000102 s)
So your calculation is correct!
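If you want to double-check the arithmetic, here is a small Python sketch (the constant names are my own, not from the problem statement):

```python
# Part 1: average time = transmission time + mean propagation delay
DATA_RATE = 10e6      # bits per second (10 Mbps)
FRAME_BITS = 1000     # frame size in bits
PROP_SPEED = 200e6    # propagation speed: 200 m/us = 2 x 10^8 m/s
AVG_DISTANCE = 375.0  # estimated mean station separation in metres

transmission_time = FRAME_BITS / DATA_RATE    # 100 microseconds
propagation_time = AVG_DISTANCE / PROP_SPEED  # 1.875 microseconds
total = transmission_time + propagation_time
print(f"{total * 1e6:.3f} microseconds")      # 101.875, i.e. roughly 102 us
```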
Now, the second part of your query...
2) If the two stations begin to transmit at exactly the same time, their packets will interfere with each other. If each transmitting station monitors the bus during transmission, how long does it take before it notices the interference, in seconds? In bit times?
The answer... in seconds...
In the worst case the two stations sit at opposite ends of the 1 km bus. Since both begin transmitting at the same instant, each one notices the interference exactly one end-to-end propagation delay after it starts:
T = 1000 m / (200 x 10^6 m/s) = 0.000005 s, or 5 microseconds
The answer... in bit times...
Propagation delay is often expressed in a unit called a 'bit time'. One bit time is the duration of one data bit on the network; at 10 Mbps that is 1/10,000,000 of a second (0.1 µs).
To convert T to bit times, multiply by the data rate:
T (bit times) = 10^7 bits/s * 0.000005 s = 50 bit times
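The same worst-case calculation as a short Python sketch (again, the constant names are mine):

```python
# Part 2: time for a transmitting station to notice the collision, worst case
DATA_RATE = 10e6     # bits per second
PROP_SPEED = 200e6   # metres per second (200 m/us)
BUS_LENGTH = 1000.0  # metres: worst case puts the two stations at opposite ends

t_seconds = BUS_LENGTH / PROP_SPEED   # one end-to-end propagation delay
t_bit_times = t_seconds * DATA_RATE   # convert by multiplying by the data rate
print(f"{t_seconds * 1e6:.0f} microseconds = {t_bit_times:.0f} bit times")
```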
The Ethernet standard specifies a minimum frame size of 512 bits (64 bytes), so networks are always designed such that the absolute worst-case round-trip delay is less than 512 bit times; that way a station is still transmitting when a colliding signal returns, and every collision is detected.
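As a quick sanity check under the figures above (my own constant names; the real standard's slot-time budget also allows for repeaters and other delays), this 1 km bus sits comfortably inside the 512-bit-time limit:

```python
# Worst-case round-trip delay on the 1 km bus, expressed in bit times
DATA_RATE = 10e6      # bits per second
PROP_SPEED = 200e6    # metres per second
BUS_LENGTH = 1000.0   # metres
SLOT_TIME_BITS = 512  # Ethernet minimum frame size, in bit times

# Multiply before dividing to keep the arithmetic exact in floating point.
round_trip_bits = 2 * BUS_LENGTH * DATA_RATE / PROP_SPEED
print(round_trip_bits)  # 100.0 -- well under the 512-bit-time limit
assert round_trip_bits < SLOT_TIME_BITS
```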