A fundamental point missing here is that science and technology are chaotic processes. There are organized funders and efforts, of course (NIH, the Manhattan Project, Plan Calcul...), but most people "go into" tech or science simply out of their own interests or curiosity, or because "it's what they happen to be good at". And much progress is more or less dumb luck: "they were looking for something else and they noticed this". Much of a researcher's or engineer's work is about keeping their eyes open and being in the right place.
So most likely there are already AGI efforts headed in every direction, and those directions themselves say little about what will emerge as most important. AGIs aimed at scientific research had better be trained, deliberately, to notice the unexpected. Some will follow human obsessions and some will not. Some will align with population-level concerns and some will serve individual masters and agendas. And we had better be ready for, and understand, the ones that will not follow "human-ish" concerns (and how they might not), quite possibly usefully so from the standpoint of advancing science. All the while understanding that all of these will happen anyway.