Use dbt, host it on whatever they've got (vSphere, cloud VMs, whatever), or schedule it the same way their "DevOps" or "CI/CD" pipelines are orchestrated. That makes it easier if you wind up having a bunch of workers move data in parallel during the migration window, or if you need to do a rolling refresh of the destination for validation (rough sketch of the parallel-worker part below).
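A minimal sketch of the parallel-worker idea, assuming a Postgres-to-Postgres move with psycopg2 and identical table definitions on both sides; the DSNs, table list, and batch size are all placeholders, not anything from your environment:

```python
# Sketch: fan per-table copy jobs out across worker threads during the
# migration window. Placeholder DSNs/tables; assumes matching schemas.
from concurrent.futures import ThreadPoolExecutor, as_completed

import psycopg2
from psycopg2.extras import execute_values

SRC_DSN = "postgresql://user:pass@source-host/db"  # placeholder
DST_DSN = "postgresql://user:pass@dest-host/db"    # placeholder
TABLES = ["orders", "customers", "line_items"]     # placeholder list

def copy_table(table: str, batch: int = 10_000) -> str:
    """Stream one table from source to destination in batches."""
    with psycopg2.connect(SRC_DSN) as src, psycopg2.connect(DST_DSN) as dst:
        # Named cursor = server-side cursor, so big tables aren't pulled
        # into memory all at once.
        with src.cursor(name=f"copy_{table}") as read, dst.cursor() as write:
            read.execute(f"SELECT * FROM {table}")
            while rows := read.fetchmany(batch):
                execute_values(write, f"INSERT INTO {table} VALUES %s", rows)
        dst.commit()
    return table

# One worker per table; widen max_workers if the window is tight.
with ThreadPoolExecutor(max_workers=4) as pool:
    for done in as_completed(pool.submit(copy_table, t) for t in TABLES):
        print(f"finished {done.result()}")
```

Threads are fine here because the work is I/O-bound; if one giant table dominates the window, you'd split it by key range instead of by table.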
If you can get access to the source database, query the INFORMATION_SCHEMA views (select * from information_schema.columns) to get the tables, columns, and data types in the source system. Do the same on the destination, and once you have the column and table metadata from both systems the mapping is mostly mechanical; chatbots probably do OK at drafting the cross-system type mapping too. Sketch of the diff step below.
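A sketch of that mapping step, assuming both databases expose the standard information_schema views and are reachable with psycopg2; the DSNs and schema name are placeholders:

```python
# Sketch: pull column metadata from both sides via information_schema,
# then diff it to find missing columns and type mismatches.
import psycopg2

METADATA_SQL = """
    SELECT table_name, column_name, data_type
    FROM information_schema.columns
    WHERE table_schema = %s
    ORDER BY table_name, ordinal_position
"""

def column_types(dsn: str, schema: str = "public") -> dict:
    """Map (table, column) -> declared data_type for one database."""
    with psycopg2.connect(dsn) as conn, conn.cursor() as cur:
        cur.execute(METADATA_SQL, (schema,))
        return {(t, c): dt for t, c, dt in cur.fetchall()}

src = column_types("postgresql://user:pass@source-host/db")  # placeholder
dst = column_types("postgresql://user:pass@dest-host/db")    # placeholder

# Columns that exist on only one side need an explicit mapping decision.
for key in sorted(src.keys() ^ dst.keys()):
    side = "source only" if key in src else "destination only"
    print(f"{key}: {side}")

# Columns present in both but with different declared types.
for key in sorted(src.keys() & dst.keys()):
    if src[key] != dst[key]:
        print(f"{key}: {src[key]} -> {dst[key]}")
```

The mismatch list is also a handy input if you do hand the mapping to an LLM: give it the two type columns and have it propose casts, then review.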