What cultural/societal effects will women being better educated than men cause?
Within the United States, women have increasingly been pursuing higher education and careers that require a college degree. How will this affect American culture/society in the future?
https://spartanshield.org/42176/feature/its-a-girls-world/
https://aibm.org/research/male-college-enrollment-and-completion/
In your opinion, how will this change society/culture, and what do you predict will result from it?