Are you using /robots.txt to try to stop indexing of additional pages? If so, that is currently not working properly (the Softr team is aware of this defect), and crawlers may not honor `Disallow:` directives.
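For reference, a typical `Disallow:` rule looks like the fragment below (the path is a placeholder, not one from your site):

```
User-agent: *
Disallow: /example-private-page
```

Note that even when honored, `Disallow:` only blocks crawling; a page can still appear in the index if other sites link to it.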
Have you checked the pages report to see what it shows?
Can you share the pattern of what you are seeing?