An agent can now keep track of the latest changes that have been made to extracted data. The agent will mark extracted data as deleted, modified or added. Previously, using duplicate scripts made it impossible to know when data had been deleted from the target website. Old data can now be deleted, kept and exported, or kept for duplicate checks only, and new default duplicate scripts can be used to copy old data to the current data set, so you end up with a complete current data set.

Click on the edge of a web element to select its parent. This is particularly convenient when selecting web elements such as table rows, which cannot be selected directly in the web browser because they are completely covered by table cells.

The embedded web browser is now based on Chrome and is completely self-contained, so no existing web browser is required on the computer where Content Grabber is running, and agent behavior no longer depends on the existing browser configuration on that computer.

Content Grabber 1 and 2 can run side-by-side on the same computer using the same license. IMPORTANT: Content Grabber 1 cannot be upgraded to version 2, so you must download and install the full version of Content Grabber 2.

Bug fixes: the XPath functions is-direct-url and is-same-domain don't work, and a command with multiple selection XPaths may select the same web element twice.

The container command option Merge Rows Value Separator has been moved to Capture commands, where it works together with the new Capture command option Merge Rows Method. When the parent container command option Export Method is set to Add Columns and Merge Rows, this option specifies how to combine row values. It can also be used to return a specific value if a match is found and another value if a match is not found; for example, True could be returned if a match is found and False if a match is not found.
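The change-tracking behavior described above can be sketched in a few lines of Python. This is only an illustration of the idea, not Content Grabber's actual implementation or API; the `track_changes` function and the `id` key field are assumptions made for the example.

```python
# Illustrative sketch (not Content Grabber's API): compare an old extracted
# data set with a newly extracted one and mark each row as added, modified,
# deleted or unchanged. Rows are keyed by a hypothetical unique "id" field.

def track_changes(old_rows, new_rows):
    """Return the current data set with a 'status' marker on each row."""
    old = {r["id"]: r for r in old_rows}
    new = {r["id"]: r for r in new_rows}
    result = []
    for key, row in new.items():
        if key not in old:
            result.append({**row, "status": "added"})
        elif row != old[key]:
            result.append({**row, "status": "modified"})
        else:
            result.append({**row, "status": "unchanged"})
    # Rows present in the old data set but missing from the new one were
    # deleted from the target website; copying them into the current data
    # set keeps the data set complete.
    for key, row in old.items():
        if key not in new:
            result.append({**row, "status": "deleted"})
    return result

old_rows = [{"id": 1, "price": 10}, {"id": 2, "price": 20}]
new_rows = [{"id": 1, "price": 12}, {"id": 3, "price": 30}]
for row in track_changes(old_rows, new_rows):
    print(row["id"], row["status"])
```

Copying deleted rows forward is what makes the "complete current data set" possible: the export always contains every row ever seen, with a marker showing its fate.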
A new Capture command option, Merge Rows Method, has been added. The new command type Crawl Website can be used to follow all links on a website. The number of links to follow can be limited by domain name, link depth and number of links. The command selects all links by default, but the web selection can be modified so that it only selects specific links.
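The limits the Crawl Website command applies can be sketched as a small breadth-first crawler. This is a minimal illustration of the behavior, not Content Grabber code; the `LINKS` map stands in for real link extraction so the example is self-contained, and all URLs and names are hypothetical.

```python
# Illustrative sketch of crawling limited by domain name, link depth and
# total number of links. Link extraction is stubbed with a static map.
from collections import deque
from urllib.parse import urlparse

# Hypothetical site structure: page URL -> links found on that page.
LINKS = {
    "https://example.com/": ["https://example.com/a", "https://other.com/x"],
    "https://example.com/a": ["https://example.com/b"],
    "https://example.com/b": [],
}

def crawl(start, domain, max_depth, max_links):
    seen, order = {start}, []
    queue = deque([(start, 0)])
    while queue and len(order) < max_links:
        url, depth = queue.popleft()
        order.append(url)
        if depth >= max_depth:
            continue  # depth limit reached; do not follow further links
        for link in LINKS.get(url, []):
            # Limit by domain name and skip links already queued.
            if urlparse(link).hostname == domain and link not in seen:
                seen.add(link)
                queue.append((link, depth + 1))
    return order

print(crawl("https://example.com/", "example.com", max_depth=2, max_links=10))
```

Here the link to other.com is skipped by the domain filter, so only the three example.com pages are visited.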