A computer takes 3x^2 + 2 milliseconds to process a certain program. If the program has 4 lines of static code (always required for the program to run) and x lines of variable code, what is the average amount of time it takes to process each line?
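One way to set this up, assuming the total number of lines is the 4 static lines plus the x variable lines and that "average" means total time divided by total lines, is

\[
\text{average time per line} = \frac{3x^2 + 2}{x + 4} \text{ milliseconds.}
\]

If a simplified form is wanted, polynomial long division (an assumed next step, not stated in the problem) gives

\[
\frac{3x^2 + 2}{x + 4} = 3x - 12 + \frac{50}{x + 4},
\]

since 3x(x + 4) = 3x^2 + 12x leaves a remainder of -12x + 2, and -12(x + 4) = -12x - 48 leaves a final remainder of 50.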