March 3, 2024, 1:58 pm
Hi CyberSEO,
Currently, CyberSEO may re-process the same URL multiple times, leading to inefficient use of resources and potentially redundant actions (full-text extraction, etc.). This can happen especially when filters are in place, for example "skip this URL if the content contains xxxx".
The idea is to prevent redundant URL processing.
CyberSEO could log processed URLs in a small TXT file (storing, say, the last 100 or 1,000 entries). This would allow the system to check the list before processing a URL, ensuring each unique URL is handled only once.
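To make the idea concrete, here is a minimal sketch of such a bounded URL log (the file name, size limit, and helper names are purely illustrative, not anything CyberSEO actually implements):

from collections import deque
from pathlib import Path

LOG_FILE = Path("processed_urls.txt")  # hypothetical file name
MAX_URLS = 1000                        # keep only the N most recent URLs

def load_processed_urls() -> deque:
    """Read the previously processed URLs, newest last."""
    if LOG_FILE.exists():
        lines = LOG_FILE.read_text(encoding="utf-8").splitlines()
        return deque(lines, maxlen=MAX_URLS)
    return deque(maxlen=MAX_URLS)

def should_process(url: str, processed: deque) -> bool:
    """Skip URLs that were already seen, whatever the earlier outcome."""
    return url not in processed

def mark_processed(url: str, processed: deque) -> None:
    """Record a URL; the deque silently drops the oldest entry past MAX_URLS."""
    processed.append(url)
    LOG_FILE.write_text("\n".join(processed), encoding="utf-8")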
What do you mean by "processing the same URL multiple times"? If you're referring to different feeds importing the same URL independently, that's expected behavior. If you mean the same post being processed multiple times within a single feed - please provide a log example. The plugin logs each cURL request step-by-step, so if that's happening, it should be visible.
As for storing processed URLs in a separate file - that's redundant. The plugin already checks for duplicates and applies filters during processing. An external TXT-based log wouldn't improve this process and might even slow it down.
In your example, the article wasn't published because it didn't pass the post filter due to insufficient content length. Since it wasn't added, the plugin didn't save its URL - and the next time it ran, it processed the same article again. This is expected and intended.
CyberSEO Pro keeps track of all successfully imported posts by storing their source URLs in the database. It compares each new item in the feed against this list to avoid duplicates. But if a post hasn't been added - due to filtering, extraction failure, API issues or anything else - the plugin will try again later. This is not a technical oversight, it's part of the core logic of the plugin.
Let's say you had a filter that blocked posts under 1000 characters. A post didn't pass, so it wasn't published. Later, you adjust the filter to 800 characters - or the original article is updated and now meets the requirement. Should the plugin ignore it just because it failed once? Of course not - that would mean permanently losing relevant content based on a temporary condition.
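Schematically, the decision made for each feed item looks like this (a simplified Python sketch of the logic described above, not the plugin's actual code; all names are invented for illustration):

MIN_LENGTH = 1000     # the post filter from the example above
imported_urls = set() # in the real plugin this list lives in the database

def publish(content: str) -> None:
    pass  # placeholder for creating the WordPress post

def handle_feed_item(url: str, content: str) -> None:
    if url in imported_urls:
        return  # genuine duplicate: already published, skip for good
    if len(content) < MIN_LENGTH:
        return  # filtered out: NOT recorded, so a later run can retry it
    publish(content)
    imported_urls.add(url)  # only successful imports are remembered

If MIN_LENGTH is later lowered to 800, or the article is updated and grows past 1000 characters, the next run imports it precisely because the failed attempt was never added to the imported list.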
Also, just to clarify, what you shared is a compilation of two separate logs from different times - two independent attempts to process the same feed item. This is normal behavior and confirms that the plugin is working as intended.