Should development of advanced AI be paused so its risks can be properly studied? Why or why not?

A recent post noted that "An open letter signed by hundreds of prominent artificial intelligence experts, tech entrepreneurs, and scientists called for a pause on the development and testing of AI technologies more powerful than OpenAI's language model GPT-4 so that the risks it may pose can be properly studied." Do you agree? Why or why not?