It is relatively easy to give an enumeration of all functions computable in polynomial time: just fix an enumeration pi_j of all polynomials, for j ranging over the natural numbers, and then consider, for every pair (i, j), the machine that behaves like the Turing machine M_i equipped with a clock pi_j. Every function computable in polynomial time can be expressed in this way.
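To make the construction concrete, here is a minimal sketch in Python (my own illustration, not part of the argument above): machines are modelled as objects exposing a hypothetical single-step interface, and the clock polynomials are assumed to be pi_j(n) = n^j + j.

```python
# Sketch of the clocked-machine enumeration described above.
# Assumptions (not from the original text): machines are Python objects with
# a start/step/output interface, and pi_j(n) = n**j + j.

def clock_polynomial(j):
    """The j-th clock polynomial of the enumeration: pi_j(n) = n^j + j."""
    return lambda n: n ** j + j

def clocked_machine(machine_i, j):
    """A machine that behaves like machine_i, but is stopped after pi_j(|x|) steps.

    machine_i is assumed to provide:
      - start(x): initialise the computation on input x,
      - step():   perform one transition, returning True while still running,
      - output(): read the result once the machine has halted.
    """
    p_j = clock_polynomial(j)

    def run(x):
        machine_i.start(x)
        budget = p_j(len(x))
        for _ in range(budget):
            if not machine_i.step():   # machine halted within the time budget
                return machine_i.output()
        return None                    # clock expired: return a dummy output
    return run
```

Every pair (i, j) yields a machine that always halts within a polynomial bound, and every polynomial-time computable function is computed by at least one such pair, which is what makes the enumeration effective.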
This is very different from the problem of deciding whether a given Turing machine runs in polynomial time, which is undecidable. The proof is not trivial, since running in polynomial time is not an extensional property of programs (two programs computing the same function may have very different running times), so Rice's theorem cannot be applied directly. You can find the proof in my article "The Intensional Content of Rice's Theorem", POPL 2008 (pearl).
The problem of providing syntactic characterizations of subrecursive complexity classes, such as P, PSPACE, etc., has received a lot of attention in the literature. The recent field of Implicit Computational Complexity aims at studying the computational complexity of programs without referring to a particular machine model and without imposing explicit bounds on time or memory, relying instead on logical or computational principles that entail complexity properties, usually through a controlled use of the available resources. To get acquainted with this topic, you may refer to the special issue on Implicit Computational Complexity, ACM Trans. Comput. Log., Volume 10, n. 4, 2009.
Other interesting characterizations have been obtained by restricting the interpretation of programming languages to finite domains. The seminal result in this area is an old work by Gurevich ("Algebras of Feasible Functions", FOCS 1983), where he proved that the interpretation of primitive recursive functions (respectively, recursive functions) over finite structures captures precisely the logspace (respectively, polynomial time) computable functions.
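Just to convey the intuition (this is my own toy illustration, not Gurevich's formalism), one can interpret the basic functions of primitive recursion over the finite domain {0, ..., n-1} with a cut-off successor: every value a primitive recursive definition can produce then fits in O(log n) bits, which is the idea behind the logspace bound.

```python
# Toy interpretation of primitive recursion over the finite domain {0, ..., n-1}.
# The successor is cut off at n-1, so every computed value stays inside the domain
# and can be stored in O(log n) bits.

def finite_interpretation(n):
    """Basic functions of primitive recursion, relativised to {0, ..., n-1}."""
    zero = 0
    succ = lambda x: min(x + 1, n - 1)   # cut-off successor keeps values in the domain

    def prim_rec(g, h):
        """Primitive recursion: f(0, y) = g(y), f(x+1, y) = h(x, f(x, y), y)."""
        def f(x, y):
            acc = g(y)
            for i in range(x):            # x < n, so the loop is bounded by the domain size
                acc = h(i, acc, y)
            return acc
        return f

    # Example: cut-off addition, defined by primitive recursion from succ.
    add = prim_rec(lambda y: y, lambda i, acc, y: succ(acc))
    return zero, succ, add

zero, succ, add = finite_interpretation(8)
assert add(3, 4) == 7
assert add(5, 6) == 7   # saturates at n - 1 = 7
```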
Please have a look at my article "Computational Complexity via Finite Types", ACM Trans. Comput. Log., Vol. 16, n. 15, 2015, for additional references in this area.