Step 1: Recall definition of returns to scale.
- Constant returns to scale: output scales exactly proportionally with inputs, i.e., \(f(tx_1, tx_2) = t f(x_1, x_2)\) for all \(t > 0\).
- Increasing returns to scale: output increases more than proportionally, i.e., \(f(tx_1, tx_2) > t f(x_1, x_2)\) for all \(t > 1\).
- Decreasing returns to scale: output increases less than proportionally, i.e., \(f(tx_1, tx_2) < t f(x_1, x_2)\) for all \(t > 1\).
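As a concrete illustration (not part of the original question), a Cobb-Douglas function \(f(x_1, x_2) = x_1^a x_2^b\) exhibits constant, increasing, or decreasing returns to scale according to whether \(a + b = 1\), \(a + b > 1\), or \(a + b < 1\). A minimal numerical sketch of the three definitions, using this hypothetical family and `classify` as an illustrative helper:

```python
def cobb_douglas(a, b):
    """Hypothetical example family: f(x1, x2) = x1**a * x2**b."""
    return lambda x1, x2: x1**a * x2**b

def classify(f, x1=2.0, x2=3.0, t=2.0):
    """Compare f(t*x1, t*x2) against t*f(x1, x2) for a scale factor t > 1."""
    scaled = f(t * x1, t * x2)          # output after scaling all inputs by t
    proportional = t * f(x1, x2)        # proportional benchmark
    if abs(scaled - proportional) < 1e-9:
        return "constant"
    return "increasing" if scaled > proportional else "decreasing"

print(classify(cobb_douglas(0.5, 0.5)))  # a + b = 1 -> constant
print(classify(cobb_douglas(0.7, 0.6)))  # a + b > 1 -> increasing
print(classify(cobb_douglas(0.3, 0.4)))  # a + b < 1 -> decreasing
```

This checks the inequalities only at one point and one \(t\), which suffices for Cobb-Douglas since \(f(tx_1, tx_2) = t^{a+b} f(x_1, x_2)\) everywhere.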
Step 2: Apply to question.
The question asks about decreasing returns to scale, so the defining condition (for \(t > 1\)) is: \[f(tx_1, tx_2) < t f(x_1, x_2)\] Final Answer: \[\boxed{f(tx_1, tx_2) < t f(x_1, x_2)}\]