Imagine that you want to import 5000+ items from your database into Elasticsearch and you have a restricted memory limit (e.g. 128 MB). You will probably end up with: PHP Fatal error: Allowed memory size of 134217728 bytes exhausted.

What do you have to do?

Each time you move your offset, you have to disable the SQL logger, clear the entity manager, and unset $data inside the while/foreach loop.
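Stripped of the Symfony and Doctrine specifics, the batching pattern looks like this. Note that fetchBatch() is a hypothetical stand-in for your repository call, and the numbers are made up for illustration:

```php
<?php

// Hypothetical data source standing in for the repository call:
// returns up to $limit items starting at $offset, or [] when done.
function fetchBatch(array $all, int $offset, int $limit): array
{
    return array_slice($all, $offset, $limit);
}

$all       = range(1, 12); // pretend these are 12 entities
$offset    = 0;
$limit     = 5;
$processed = 0;

while ($data = fetchBatch($all, $offset, $limit)) {
    foreach ($data as $item) {
        $processed++; // index the item here
    }

    unset($data);      // free the batch before fetching the next one
    $offset += $limit; // move the window forward
}

echo $processed; // prints 12
```

The loop ends naturally because an empty array is falsy in PHP, so the while condition fails once the offset passes the last row.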

Here is an example:

namespace App\Command;

use App\Repository\RecipeRepository;
use Doctrine\ORM\EntityManagerInterface;
use Elastica\Document;
use JoliCode\Elastically\Indexer; // assumed from scheduleIndex() - swap in your own indexer
use Symfony\Component\Console\Command\Command;
use Symfony\Component\Console\Helper\ProgressBar;
use Symfony\Component\Console\Input\InputInterface;
use Symfony\Component\Console\Output\OutputInterface;
use Symfony\Component\Console\Style\SymfonyStyle;

class ElasticImportCommand extends Command
{
    protected static $defaultName = 'elastic:import';

    public function __construct(private RecipeRepository $repository, private Indexer $indexer, private EntityManagerInterface $em)
    {
        parent::__construct();
    }

    protected function execute(InputInterface $input, OutputInterface $output): int
    {
        $io = new SymfonyStyle($input, $output);

        // unset logger - stop Doctrine from keeping every executed query in memory
        $this->em->getConnection()->getConfiguration()->setSQLLogger(null);

        $offset = 0;
        $limit  = 5000;

        while ($data = $this->repository->getRecipesToIndex($offset, $limit)) {
            $progressBar = new ProgressBar($io, count($data));

            foreach ($data as $recipe) {
                $this->indexer->scheduleIndex('recipe', new Document((string) $recipe->getId(), $this->createRecipeDto($recipe)));
                $progressBar->advance();
            }

            unset($data);
            $offset += $limit;
            $io->writeln('offset: ' . $offset);

            // flush your indexer interface
            $this->indexer->flush();

            // clear entity manager - this is important, otherwise every loaded entity stays tracked
            $this->em->clear();
        }

        return Command::SUCCESS;
    }
}
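The getRecipesToIndex() method is not shown above; a minimal sketch using Doctrine's QueryBuilder could look like the following. The entity name and repository class are assumptions, so adapt them to your own schema:

```php
<?php

namespace App\Repository;

use App\Entity\Recipe;
use Doctrine\Bundle\DoctrineBundle\Repository\ServiceEntityRepository;
use Doctrine\Persistence\ManagerRegistry;

class RecipeRepository extends ServiceEntityRepository
{
    public function __construct(ManagerRegistry $registry)
    {
        parent::__construct($registry, Recipe::class);
    }

    /**
     * @return Recipe[]
     */
    public function getRecipesToIndex(int $offset, int $limit): array
    {
        return $this->createQueryBuilder('r')
            ->orderBy('r.id', 'ASC') // stable ordering so the offset window is deterministic
            ->setFirstResult($offset)
            ->setMaxResults($limit)
            ->getQuery()
            ->getResult();
    }
}
```

The explicit orderBy matters: without a stable ordering, paginating with offset/limit can skip or repeat rows between batches.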