OpenAI’s massive GPT-3 model is impressive, but size isn’t everything
Last week, OpenAI published a paper detailing GPT-3, a machine learning model that achieves strong results on a number of natural language benchmarks. At 175 billion parameters (a parameter being a variable the model learns during training and that shapes its predictions), it's the largest model of its kind. And with a memory size exceeding 350GB, ….