# Allow up to 2000 simultaneously running jobs managed by this schedd
MAX_JOBS_RUNNING = 2000
 {endcode}
 
-5.   Under Windows, you <em>might</em> be able to increase the maximum number of jobs running in the schedd, but only if you also increase desktop heap space adequately.  The problem on Windows is that each running job has an instance of condor_shadow, which eats up desktop heap space.  Typically, this heap space becomes exhausted with on the order of only ~100 jobs running.  See {link: http://www.cs.wisc.edu/condor/manual/v7.0/7_4Condor_on.html#SECTION008413000000000000000 My submit machine cannot have more than 120 jobs running concurrently. Why?} in the FAQ.
+5.   Under Windows, you _might_ be able to increase the maximum number of jobs running in the schedd, but only if you also increase the desktop heap space adequately.  The problem on Windows is that each running job has its own condor_shadow process, which consumes desktop heap space; typically the heap is exhausted with only around 100 jobs running.  See {link: http://www.cs.wisc.edu/condor/manual/v7.0/7_4Condor_on.html#SECTION008413000000000000000 My submit machine cannot have more than 120 jobs running concurrently. Why?} in the FAQ, and the sketch below this item.
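 
 The FAQ entry above describes increasing the non-interactive desktop heap.  As a rough illustration (the details here are assumptions, not taken from this page), the heap sizes live in the SharedSection portion of the "Windows" value under the HKLM\SYSTEM\CurrentControlSet\Control\Session Manager\SubSystems registry key; the third number is the heap, in KB, available to non-interactive processes such as condor_shadow.  One way to inspect it from a command prompt:
 
 {code}
 REM Illustrative sketch only; run from an Administrator command prompt.
 REM The output contains something like SharedSection=1024,3072,512 -- the
 REM third number is the non-interactive desktop heap size in KB, which is
 REM what limits how many condor_shadow processes can run at once.
 reg query "HKLM\SYSTEM\CurrentControlSet\Control\Session Manager\SubSystems" /v Windows
 {endcode}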
 
 6.   Put a busy schedd's spool directory on a fast disk with little else using it.
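 
 For example, the location of the spool directory is controlled by the SPOOL configuration variable.  A minimal sketch, where the path is only a placeholder for whatever dedicated fast disk is available:
 
 {code}
 # Sketch only: put the schedd's spool on a dedicated, lightly used fast disk.
 # /fastdisk/condor/spool is a placeholder path, not a recommendation.
 SPOOL = /fastdisk/condor/spool
 {endcode}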