EASY QUESTION, EASY POINTS: Is there any mandatory test that people in the United States must take in order to become educated or find a job? (Hint: think high school tests)

Answer:

I think you mean the SAT and/or the ACT, but neither is mandatory; they are college admissions tests, not requirements for finishing school or getting a job. If this helps, please mark Brainliest. Thank you.