Some information needs to be memorized: words, basic mathematical principles, etc. Many other forms of memorization are rapidly becoming unnecessary. When was the Battle of Little Bighorn? What day did Lewis & Clark make it to the Pacific? These are questions a five-second Google search could answer, yet too often such menial factoids are used as the basis for judging a person's intelligence. Standardized tests rely on memorization; for students to succeed in the current system, they must memorize and regurgitate. Can the act of learning itself suffice for the student and the system? Or must the student demonstrate competency through a standardized methodology? If a standardized methodology must be used, can it be tailored to the multiple identified learning styles students use? What end does rote memorization actually serve?
Right now humanity has unprecedented access to knowledge on a global scale, beyond anything it has known before. If students can leverage ideas, understand concepts, and think critically, are these skills not more important than the myopic memorization of common knowledge? As it stands, memorizable knowledge is readily accessible with a few taps on a smartphone keyboard. Soon even typing will seem archaic; in the near future, knowledge will become more integrated and accessible to humans through voice and bionic circuits. It isn't beyond imagination that within the next decade humans will be able to search and learn from a device embedded within the mind. While it's arguable whether computers can think better than humans, they can provably perform simple calculations; store, access, and retrieve information; and carry out a myriad of programmable tasks better than humans can.
Humans excel at critical thinking. Why, then, do we as a society train humans to do menial tasks that computers can do infinitely better?