
What are the Slack Archives?

It's a history of our time together in the Slack Community! There's a ton of knowledge in here, so feel free to search through the archives for a possible answer to your question.

Because this space is not active, you won't be able to create a new post or comment here. If you have a question or want to start a discussion about something, head over to our categories and pick one to post in! You can always refer back to a post from Slack Archives if needed; just copy the link to use it as a reference.

Hi all, we are experiencing a problem that messages in `sync.search.product` queue are not being processed

USZ0XG6SK
USZ0XG6SK Posts: 111 🧑🏻‍🚀 - Cadet

Hi all,
we are experiencing a problem where messages in the sync.search.product queue are not being processed (they keep being retried over and over). I was looking at the error logs and couldn't find anything specific to this queue. The only thing I found is the following:

```
bash-5.0# tail -f /var/log/spryker/php_errors.log
[13-Aug-2020 09:09:19 UTC] PHP Fatal error:  Allowed memory size of 1073741824 bytes exhausted (tried to allocate 20480 bytes) in /data/vendor/spryker/util-encoding/src/Spryker/Service/UtilEncoding/Model/Json.php on line 55
[13-Aug-2020 09:09:19 UTC] PHP Fatal error:  Allowed memory size of 1073741824 bytes exhausted (tried to allocate 20480 bytes) in /data/vendor/spryker/error-handler/src/Spryker/Shared/ErrorHandler/ErrorHandlerEnvironment.php on line 93
```

Those errors are being thrown in the Jenkins container every 5-7 seconds.
Did anyone experience these issues, and does anyone have a suggestion for how to fix them?
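
For reference, the limit in the fatal error is exactly 1 GiB, and the allocation that pushed it over the edge was tiny, which points at gradual accumulation rather than one huge allocation. A quick, generic Python sketch to pull the numbers out of such a log line (not part of Spryker, just illustration):

```python
import re

# Abridged example of the log line quoted above.
log_line = ("[13-Aug-2020 09:09:19 UTC] PHP Fatal error:  Allowed memory size of "
            "1073741824 bytes exhausted (tried to allocate 20480 bytes)")

# Extract the configured limit and the allocation that failed.
match = re.search(r"Allowed memory size of (\d+) bytes .* allocate (\d+) bytes", log_line)
limit_bytes, alloc_bytes = (int(n) for n in match.groups())

print(limit_bytes == 1024 ** 3)  # True: the PHP memory_limit is exactly 1 GiB
print(alloc_bytes // 1024)       # 20: the failed allocation was only 20 KiB
```

The fact that a 20 KiB allocation fails against a 1 GiB limit suggests the memory was already filled by something else, such as a whole chunk of queue messages held in memory at once.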

Comments

  • sprymiker
sprymiker Cloud Platform Architect Sprykee Posts: 781 🧑🏻‍🚀 - Cadet

Could it be that a message in the sync queue is too big? Reducing the bulk size would probably help.

  • UKHD8KTMF
UKHD8KTMF Posts: 393 🧑🏻‍🚀 - Cadet

Last time I had a similar error, there were messages that could not be processed. What happens is that when there is an error, the worker tries to process the messages individually, and when it runs out of memory it fails and the messages are not acknowledged. Reducing the batch size helps, as Mike suggested.
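
The behaviour described here, batch processing with a per-message fallback and acknowledgement only on success, can be sketched roughly as follows. This is a generic Python illustration with hypothetical `process` and `ack` callables, not Spryker's actual worker code:

```python
def consume_chunk(messages, process, ack):
    """Process a chunk of queue messages; on failure, retry one by one.

    Messages that still fail are left unacknowledged, so the broker
    redelivers them later. If even a single message cannot fit in
    memory, nothing is ever acked and the queue loops forever."""
    try:
        process(messages)           # fast path: handle the whole chunk at once
        for msg in messages:
            ack(msg)
    except Exception:
        for msg in messages:        # slow path: isolate the bad message(s)
            try:
                process([msg])
                ack(msg)
            except Exception:
                pass                # not acked -> will be redelivered
```

If the out-of-memory error hits even on the per-message path (for example, because a single document is huge), no message is ever acknowledged, which matches the "retried all over again" symptom in the question.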

  • USZ0XG6SK
USZ0XG6SK Posts: 111 🧑🏻‍🚀 - Cadet
    edited August 2020

As far as I can see, the default chunk size is set to 10000. What should the reduced value be in this case?

  • UKHD8KTMF
UKHD8KTMF Posts: 393 🧑🏻‍🚀 - Cadet

The chunk value out of the box is 500; I have no idea where 10000 is coming from.

  • UKHD8KTMF
UKHD8KTMF Posts: 393 🧑🏻‍🚀 - Cadet

I would set it to the default and see if that helps, and then reduce it even more if you still get an out-of-memory exception.

  • Andriy Netseplyayev
Andriy Netseplyayev Domain Lead Solution Architecture Sprykee Posts: 519 🧑🏻‍🚀 - Cadet

500 might be for publishing and 10000 for syncing; that sounds fine. The Elasticsearch document can be quite big, depending on your data, and putting the whole batch into memory can be troublesome. So yes, as a quick solution, try reducing the chunk size. But I would also check: aren't you sending too much data to Elasticsearch?

  • sprymiker
sprymiker Cloud Platform Architect Sprykee Posts: 781 🧑🏻‍🚀 - Cadet
    edited August 2020

The more attributes you have, the more data you need to pump (for products).

  • USZ0XG6SK
USZ0XG6SK Posts: 111 🧑🏻‍🚀 - Cadet

    I will try reducing the chunk size at first and then investigate further.

  • USZ0XG6SK
USZ0XG6SK Posts: 111 🧑🏻‍🚀 - Cadet

We do have around 350 attributes per product.

  • sprymiker
sprymiker Cloud Platform Architect Sprykee Posts: 781 🧑🏻‍🚀 - Cadet

That's probably the root cause.

  • sprymiker
sprymiker Cloud Platform Architect Sprykee Posts: 781 🧑🏻‍🚀 - Cadet

Better to have a chunk that can be processed within 128 MB of memory, IMO.

  • USZ0XG6SK
USZ0XG6SK Posts: 111 🧑🏻‍🚀 - Cadet

    I reduced the chunk size to 500 and it is working now.

  • USZ0XG6SK
USZ0XG6SK Posts: 111 🧑🏻‍🚀 - Cadet

Also, I checked the message payload size and it seems to be around 20 KB at the moment, so I could probably go up to a chunk size of 5000 and stay within 128 MB. I will play with it a bit.
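
The back-of-the-envelope arithmetic here checks out for raw payloads (a quick sketch; the numbers come from the thread, and the caveat about decoding overhead is an assumption, not a measured value):

```python
payload_kb = 20      # observed payload size per message
chunk_size = 5000    # proposed chunk size
budget_mb = 128      # suggested memory budget for one chunk

# Raw serialized payload for one full chunk, in MB.
chunk_mb = payload_kb * chunk_size / 1024
print(chunk_mb)               # 97.65625 -> about 98 MB of raw payload
print(chunk_mb <= budget_mb)  # True: 5000 fits the 128 MB budget on paper
```

Note this counts only the serialized payload; decoded PHP arrays and objects typically take several times the serialized size, so the safe chunk size in practice is lower, which is consistent with 500 already working above.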

  • USZ0XG6SK
USZ0XG6SK Posts: 111 🧑🏻‍🚀 - Cadet

Thank you for your help!

  • sprymiker
sprymiker Cloud Platform Architect Sprykee Posts: 781 🧑🏻‍🚀 - Cadet

I meant 128 MB of memory used by PHP, not the data itself.