
Non-blocking file_put_contents in function.php


I have a function in function.php which is called on save_post. Currently, when the user publishes or updates a post, this function runs but takes a lot of time.

What is the easiest way to run this function in a background process, i.e. non-blocking?

function export_all_in_json() {
   // The slow part is the remote fopen() call (an HTTP request), not file_put_contents() itself.
   file_put_contents( ABSPATH . 'all.json', fopen( 'https://example/api/all/get_all_content/', 'r' ) );
}
add_action( 'save_post', 'export_all_in_json' );
asked Jul 22, 2020 at 15:49 by Mustafa
  • Is there a reason this happens on save, and specifically always on save? Could it not be done in a cron job? Also why do you use raw fopen to fetch the data? – Tom J Nowell Commented Jul 22, 2020 at 15:56
  • The reason why I need to update 'all.json' immediately after save_post is because its content is tied to an application. I can have this in another stand-alone php file but is there a way to add a cron job and run it immediately after save_post? – Mustafa Commented Jul 22, 2020 at 16:10
  • The fundamental problem here is that you've taken the output and put it into file_put_contents, then focused on file_put_contents as the slow part. It's not. Fetching a remote resource is one of the slowest things you can do. It's the fopen call that's super super slow/expensive. That's what you need to make async, but you're passing it into file_put_contents, so you can't. Moving this to a cron job would make this process async, but then it wouldn't be immediate. WP Cron only runs if users visit. – Tom J Nowell Commented Jul 22, 2020 at 16:16
  • I believe there is almost certainly a much better approach, but by making your question super generic, and hiding so much detail behind an example API, it's impossible to advise on how to do that. There's just too little information about why it needs to poke the external API and fetch the JSON file to answer. E.g. do you control the other end? If so, that opens up a tonne of options that aren't possible with your question as is. There's also the question of eventual consistency. There are also big bugs in the example you gave, e.g. it will make the request when items are added to menus – Tom J Nowell Commented Jul 22, 2020 at 16:18
  • Then why download the JSON file at all? Just make the other end send a request with the JSON every 10 minutes to your site, and eliminate the save_post filter entirely. After all, you're not sending any information. If the goal is to keep a JSON file up to date, ask about that problem! Don't ask about a proposed solution instead – Tom J Nowell Commented Jul 22, 2020 at 16:25
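A minimal sketch of the "cron job right after save_post" idea raised in the comments, assuming placeholder hook and function names (queue_json_export, run_json_export). The save_post callback only schedules a one-off event, so the editor's request returns quickly; the slow fetch happens later in a WP-Cron request, with the caveat noted above that WP-Cron only fires when the site gets traffic:

function queue_json_export( $post_id ) {
    // Skip autosaves and revisions so the export is not scheduled needlessly.
    if ( wp_is_post_autosave( $post_id ) || wp_is_post_revision( $post_id ) ) {
        return;
    }
    // Schedule a one-off WP-Cron event that runs as soon as possible, outside this request.
    if ( ! wp_next_scheduled( 'run_json_export' ) ) {
        wp_schedule_single_event( time(), 'run_json_export' );
    }
}
add_action( 'save_post', 'queue_json_export' );

// The slow remote fetch and the file write now happen inside a WP-Cron request.
function run_json_export() {
    $response = wp_remote_get( 'https://example/api/all/get_all_content/', array( 'timeout' => 60 ) );
    if ( ! is_wp_error( $response ) ) {
        file_put_contents( ABSPATH . 'all.json', wp_remote_retrieve_body( $response ) );
    }
}
add_action( 'run_json_export', 'run_json_export' );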

2 Answers


Instead of contacting the remote site to get the information in an expensive HTTP request, why not have the remote site send you the data at regular intervals?

  1. Register a REST API endpoint to receive the data on (sketched after this list)
    • In that endpoint, take in the data and save it in all.json
  2. On the remote site add a cron job
    • grab the data
    • make a non-blocking remote request to the site with this data

Now we have a completely asynchronous non-blocking system. Why fetch the data when it can be given to you?
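A minimal sketch of the receiving endpoint from step 1, assuming a hypothetical myplugin/v1/import-json route and a shared-secret header for authentication (both are placeholders, not part of the original answer):

add_action( 'rest_api_init', function () {
    register_rest_route( 'myplugin/v1', '/import-json', array(
        'methods'             => 'POST',
        'callback'            => function ( WP_REST_Request $request ) {
            // Write whatever the remote site sent straight into all.json.
            file_put_contents( ABSPATH . 'all.json', $request->get_body() );
            return rest_ensure_response( array( 'saved' => true ) );
        },
        'permission_callback' => function ( WP_REST_Request $request ) {
            // Bare-minimum auth: compare a shared-secret header; replace with something stronger.
            return hash_equals( 'my-shared-secret', (string) $request->get_header( 'x-export-key' ) );
        },
    ) );
} );

The remote site's cron job (step 2) would then POST its JSON to that route (with the placeholder namespace above, /wp-json/myplugin/v1/import-json), so nothing slow ever runs during a post save.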

This gives us several bonuses:

  • The save_post filter can be completely removed, making post saving faster
  • Removing the filter also fixes a number of bugs it caused (e.g. making the request whenever menu items are saved)
  • Updates to the file can happen even when no posts are being created
  • Updates to the file are now predictable and routine, e.g. every 5 minutes, or every hour
  • This avoids race conditions where multiple requests are sent to the API at the same time, resulting in extra server load and broken JSON files
  • Your API endpoint takes a little time to generate the JSON data, so this gives you control over how often that happens, e.g. if the site is struggling, change the cron job from 5 minutes to 10 minutes to ease the load
  • You could ping the API and tell it to trigger sending the data to the endpoint when a post is saved, rather than doing the full fetch and save (a minimal ping is sketched below). This would let you keep save-triggered updates and still have the advantages above. It's similar to how some payment and authentication flows (webhooks) work too.
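As a rough illustration of that last bullet, assuming a hypothetical trigger URL on the remote API: a non-blocking wp_remote_post fires the ping and returns without waiting for a response, so saving the post stays fast.

function ping_remote_export( $post_id ) {
    if ( wp_is_post_revision( $post_id ) ) {
        return;
    }
    // 'blocking' => false sends the request and returns immediately, without waiting for a reply.
    wp_remote_post( 'https://example/api/all/trigger_push/', array(
        'blocking' => false,
        'timeout'  => 1,
    ) );
}
add_action( 'save_post', 'ping_remote_export' );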

Maybe you can use Action Scheduler to trigger a single asynchronous event. It works just like a cron job, but is triggered only once.

https://actionscheduler.org/
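A minimal sketch of that idea, assuming Action Scheduler is installed (it ships with WooCommerce, for example) and using a placeholder hook name; this would replace the original add_action( 'save_post', 'export_all_in_json' ) line from the question:

// On save, only queue an async action; the editor's request is not blocked.
function queue_export_action() {
    if ( function_exists( 'as_enqueue_async_action' ) ) {
        as_enqueue_async_action( 'export_all_in_json_async' );
    }
}
add_action( 'save_post', 'queue_export_action' );

// Action Scheduler picks this up in its own background request and runs the slow export there.
add_action( 'export_all_in_json_async', 'export_all_in_json' );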
