Dan Langille

I've been playing with Open Source software, starting with FreeBSD, since New Zealand Post installed DSL on my street in 1998. From there, I started writing at The FreeBSD Diary, moving my work here after I discovered WordPress. Along the way, I started the BSDCan and PGCon conferences. I slowly moved from software development into full-time systems administration and now work for a very well-known company which has been a big force in the security industry.

May 29 2021
 

Today at about 1:00 pm Philadelphia time, the website went offline. It was still there, it just could not answer any queries.

In this post:

  • Website and database server running in AWS
  • FreeBSD 12.2 on a t3.medium EC2 instance
  • PostgreSQL 12.5 running on a db.t2.large RDS instance

Initial look

Looking at load, it was acceptable:

$ w
 7:09PM  up 43 days, 20:48, 1 user, load averages: 0.12, 0.26, 0.30
USER       TTY      FROM                                      LOGIN@  IDLE WHAT
ec2-user   pts/0    [redacted]                                7:09PM     - w

There were a lot of php-fpm: pool www (php-fpm) processes running.

Looking in the logs, I found:

[root@nginx01 /var/log]# tail -F php-fpm.log 
[29-May-2021 17:41:34] WARNING: [pool www] seems busy (you may need to increase pm.start_servers, or pm.min/max_spare_servers), spawning 16 children, there are 17 idle, and 224 total children
[29-May-2021 17:41:41] WARNING: [pool www] seems busy (you may need to increase pm.start_servers, or pm.min/max_spare_servers), spawning 8 children, there are 19 idle, and 234 total children
[29-May-2021 17:41:43] WARNING: [pool www] seems busy (you may need to increase pm.start_servers, or pm.min/max_spare_servers), spawning 16 children, there are 17 idle, and 235 total children
[29-May-2021 17:41:44] WARNING: [pool www] seems busy (you may need to increase pm.start_servers, or pm.min/max_spare_servers), spawning 32 children, there are 18 idle, and 238 total children
[29-May-2021 17:41:45] WARNING: [pool www] seems busy (you may need to increase pm.start_servers, or pm.min/max_spare_servers), spawning 32 children, there are 18 idle, and 240 total children
[29-May-2021 17:41:46] WARNING: [pool www] seems busy (you may need to increase pm.start_servers, or pm.min/max_spare_servers), spawning 32 children, there are 19 idle, and 242 total children
[29-May-2021 17:41:51] WARNING: [pool www] seems busy (you may need to increase pm.start_servers, or pm.min/max_spare_servers), spawning 8 children, there are 18 idle, and 249 total children
[29-May-2021 17:41:55] WARNING: [pool www] server reached pm.max_children setting (250), consider raising it
[29-May-2021 18:02:26] WARNING: [pool www] child 12962 exited with code 70 after 1759.560106 seconds from start
[29-May-2021 18:02:26] NOTICE: [pool www] child 22023 started
^C

Yeah, that’s a lot.

Looking at the RDS statistics, the database was busy, at about 98% load I think; I did not take a screen shot at the time. But here is a screen shot taken in hindsight, showing the fast build-up in the number of sessions, from near zero to 250, where it stayed for 2.5 hours.

250 database sessions

From this, I concluded:

  • select * from elementGetCaseInsensitive() is suspect
  • this happened suddenly

The query

Here is the query:

   SELECT tmp.*, EP.pathname FROM (
      SELECT id,
             name::text,
             directory_file_flag::text,
             status::text,
             case when IsCategory(id) IS NULL THEN FALSE ELSE TRUE END,
             case when IsPort(    id) IS NULL THEN FALSE ELSE TRUE END
        FROM element
       WHERE id = (SELECT element_id
        FROM element_pathname
       WHERE lower(pathname) = lower($1))
     ) AS tmp JOIN element_pathname EP on tmp.id = EP.element_id;

And an explain analyse:

freshports.org=> explain analyse select * from elementGetCaseInsensitive('/ports/head/sysutils/nut');
                                                                       QUERY PLAN                                                                       
--------------------------------------------------------------------------------------------------------------------------------------------------------
 Nested Loop  (cost=29669.04..29685.59 rows=1 width=137) (actual time=6396.216..6396.221 rows=1 loops=1)
   InitPlan 1 (returns $0)
     ->  Seq Scan on element_pathname  (cost=0.00..29668.18 rows=5748 width=4) (actual time=4123.990..6395.702 rows=1 loops=1)
           Filter: (lower(pathname) = '/ports/head/sysutils/nut'::text)
           Rows Removed by Filter: 1149267
   ->  Index Scan using element_pkey on element  (cost=0.43..8.45 rows=1 width=23) (actual time=6395.745..6395.746 rows=1 loops=1)
         Index Cond: (id = $0)
   ->  Index Scan using element_pathname_element_id on element_pathname ep  (cost=0.43..8.45 rows=1 width=56) (actual time=0.008..0.009 rows=1 loops=1)
         Index Cond: (element_id = $0)
 Planning Time: 0.392 ms
 Execution Time: 6396.252 ms
(11 rows)

That’s terrible. Look at that sequential scan on over 1.1 million rows.

I know how to improve this: a new index.

The alteration

This is how I altered the table, first, the original definition:

freshports.org-> \d element_pathname
            Table "public.element_pathname"
   Column   |  Type   | Collation | Nullable | Default 
------------+---------+-----------+----------+---------
 element_id | integer |           | not null | 
 pathname   | text    |           | not null | 
Indexes:
    "element_pathname_pathname" UNIQUE, btree (pathname)
    "element_pathname_element_id" btree (element_id)
Foreign-key constraints:
    "element_pathname_element_id_fkey" FOREIGN KEY (element_id) REFERENCES element(id) ON DELETE CASCADE

freshports.org-> 

This is the index I added:

freshports.org=> begin;
BEGIN
freshports.org=> create index element_pathname_pathname_lc on element_pathname (lower(pathname));
CREATE INDEX
freshports.org=> commit;
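
An aside, not something done here: a plain CREATE INDEX blocks writes to the table while the index builds. On a busy production table, a concurrent build may be preferable; a minimal sketch (it cannot run inside a transaction block, so no begin/commit):

-- builds the index without blocking concurrent writes to element_pathname
create index concurrently element_pathname_pathname_lc
    on element_pathname (lower(pathname));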

The new explain analyze

Now that query runs quickly:

freshports.org=> explain analyse select * from elementGetCaseInsensitive('/ports/head/sysutils/nut');
                                                                       QUERY PLAN                                                                       
--------------------------------------------------------------------------------------------------------------------------------------------------------
 Nested Loop  (cost=10438.60..10455.15 rows=1 width=137) (actual time=0.292..0.297 rows=1 loops=1)
   InitPlan 1 (returns $0)
     ->  Bitmap Heap Scan on element_pathname  (cost=265.08..10437.74 rows=5746 width=4) (actual time=0.042..0.043 rows=1 loops=1)
           Recheck Cond: (lower(pathname) = '/ports/head/sysutils/nut'::text)
           Heap Blocks: exact=1
           ->  Bitmap Index Scan on element_pathname_pathname_lc  (cost=0.00..263.65 rows=5746 width=0) (actual time=0.037..0.038 rows=1 loops=1)
                 Index Cond: (lower(pathname) = '/ports/head/sysutils/nut'::text)
   ->  Index Scan using element_pkey on element  (cost=0.43..8.45 rows=1 width=23) (actual time=0.053..0.054 rows=1 loops=1)
         Index Cond: (id = $0)
   ->  Index Scan using element_pathname_element_id on element_pathname ep  (cost=0.43..8.45 rows=1 width=56) (actual time=0.007..0.008 rows=1 loops=1)
         Index Cond: (element_id = $0)
 Planning Time: 0.604 ms
 Execution Time: 0.332 ms
(13 rows)

freshports.org=> 

With that change, the website soon returned to its usual snappiness.

The stats show the results

This screen shot shows that the number of sessions is back to normal. See also that the query in question now completes in under 0.01 seconds. Good.

sessions down near 0

I should have known to add this index when that new function was created. I don't see an explain analyze in that post.

May 09 2021
 

This post is a how-to and a reminder for myself. I'm working on finding a better way for FreshPorts to recognize a new physical ports category when it encounters one. I think the only way is to look at the directories in the FreeBSD ports repo.

A discussion on IRC led to this shell script:

[dan@devgit-ingress01:/var/db/ingress/repos/ports] $ find -s -f * -type d -regex '[a-z].*' -maxdepth 0 | xargs | wc -w
      62

I will break that down for future reference:

  1. -s : traverse the file hierarchies in lexicographical order; it gives us the output in sorted order
  2. -f * : avoids nasty situations where there might be a file named -foo in the tree. It also avoids use of find . which would give results with a leading ./ in the filenames
  3. -type d : only directories
  4. -regex '[a-z].*' : match only items which start with a lower-case character (i.e. ignore Mk, LEGAL, COPYRIGHT, .hooks)
  5. -maxdepth 0 : stay in the top-level directory
  6. | xargs : pipe the results onto one line
  7. | wc -w : count the number of words returned; included here only for testing purposes

The need for this script arose when FreshPorts encountered a commit for a .dotfile. While this wasn't the first commit of such a file, it did bring my attention to the need for a list of categories. The commit in question added the .hooks/prepare-commit-msg file to the ports tree. The code incorrectly concluded this was a new port (i.e. prepare-commit-msg). FreshPorts kept looking for a Makefile for this port, and never found it. It kept looking: waiting, then trying again. The code could have seen, by looking in the repo, that this was a file, not a directory, and that no Makefile was coming.

Instead, I’m taking a different approach. With each commit, the code will query the ports repo and get a list of the current categories. This list will be referenced and the code will know whether a given file affects a port or not. As I type this, that code is not yet written, but it will get written as I go through this process.

Part of the debug log

FreshPorts logs. A lot. The commit in question has 1307 lines of output:

[dan@devgit-ingress01:/var/db/freshports/message-queues/retry] $ wc -l 2021.04.20.09.58.35.000000.bbc2474ef7a65eb8561c8ecf7af80c2bfed1f248.log
    1307 2021.04.20.09.58.35.000000.bbc2474ef7a65eb8561c8ecf7af80c2bfed1f248.log

The file name is taken from the date/time that the commit was processed by FreshPorts, in this case, 2021-04-20 at 09:58:35 (all times are UTC). The 000000 is a counter, so this was the first commit found during that processing time. We have room for 1,000,000 commits at a time before that counter rolls over. Next is the commit hash, which should be unique enough, but we still prefix the file name with that timestamp.

On line 115 of that log file, we find:

getting id from 'commit_log_elements_id_seq'
sql is insert into commit_log_elements(id, commit_log_id, element_id, revision_name, change_type) values 
                                        (4388001, 849545, 1215757, 'bbc2474ef7a65eb8561c8ecf7af80c2bfed1f248', 'A')
sql = 'select ElementTagSet(1, 1215757, 'bbc2474ef7a65eb8561c8ecf7af80c2bfed1f248')'
pushing the following onto @Files
FileAction='Add'
FilePath='/ports/head/.hooks/prepare-commit-msg'
FileRevision='bbc2474ef7a65eb8561c8ecf7af80c2bfed1f248'
commit_log_element->{id}='4388001'
element_id='1215757'
Observer has noticed that commit 'bbc2474ef7a65eb8561c8ecf7af80c2bfed1f248' contains file /ports/head/.hooks/prepare-commit-msg as revision bbc2474ef7a65eb8561c8ecf7af80c2bfed1f248 in repos
STARTING _CompileListOfPorts ................................
for a commit on 'branch': 'head'
this commit is on head
FILE ==: Add, /ports/head/.hooks/prepare-commit-msg, bbc2474ef7a65eb8561c8ecf7af80c2bfed1f248, ports, .hooks, prepare-commit-msg, 4388001
YES, this file is in the ports tree
checking for category='.hooks'
sql = "select * from categories where name = '.hooks'"
NOT FOUND
creating new category .hooks

Line 130 correctly determines that this file is in the ports tree.

Line 131 is where things start going wrong. This is not a category.

Ignored items

Way back when, a list of ignored items was created:

#
# These are the entries within /usr/ports/ which we ignore
# and /usr/ports/<category> which FreshPorts does not track
#
%FreshPorts::Constants::IgnoredItems = (
        "Attic"        => 1,
        "distfiles"    => 2,
        "Mk"           => 3,
        "Tools"        => 4,
        "Templates"    => 5,
        "Makefile"     => 6,
        "Makefile.inc" => 7,
        "CVSROOT"      => 8,
        "base"         => 9,
);

This was good enough for the early days. The list of IgnoredItems dates back to Fri Nov 9 16:30:29 2001 when it was a single string:

$FreshPorts::Constants::IgnoredItems = "Attic|distfiles|Mk|Tools|Templates|Makefile|pkg";

Today, I'm removing that and introducing a new module: FreshPorts::Categories.
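
The module itself is not shown in this post. As a rough sketch of the idea, assuming it is populated from the top-level directories of the ports checkout (the sub name Load and $repodir are mine, not from the FreshPorts code; only the @FreshPorts::Categories::categories array appears in the code change further down):

package FreshPorts::Categories;

use strict;
use warnings;

# the list of category names the commit-processing code checks against
our @categories;

# hypothetical helper: collect top-level directories which start with
# a lower-case letter, mirroring the find command above
sub Load {
    my ($repodir) = @_;

    opendir(my $dh, $repodir) or die "cannot open $repodir: $!";
    @categories = sort grep { /^[a-z]/ && -d "$repodir/$_" } readdir($dh);
    closedir($dh);

    return @categories;
}

1;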

Getting the XML

Getting back to the original purpose of this post, creating XML.

Based on git commit processing – how is it done?, I found a reference to git-to-freshports-xml.py.

I tried this:

 $ ./git-to-freshports-xml.py
usage: git-to-freshports-xml.py [-h] -p PATH -O OUTPUT -S SPOOLING -r REPO [-o OS] [-f] [-v] [-l {syslog,stderr}] (-c COMMIT | -s SINGLE_COMMIT | -R COMMIT_RANGE)
git-to-freshports-xml.py: error: the following arguments are required: -p/--path, -O/--output, -S/--spooling, -r/--repo

Looking in the logs, I found this:

[dan@devgit-ingress01:~/scripts] $ sudo grep git-to-freshports-xml.py /var/log/freshports/git.log | head -1
2021.05.09 00:03:14 git-delta.sh /usr/local/libexec/freshports/git-to-freshports-xml.py --repo doc --path /var/db/ingress/repos/doc --commit cf1bad339628407060c88a1b20218f0c9660ba11 --spooling /var/db/ingress/message-queues/spooling --output /var/db/ingress/message-queues/incoming

Now I have something I can use to create the XML.

But if you run that, as I did, you'll get thousands of XML files. I needed to use -s instead of --commit.

This worked:

[dan@devgit-ingress01:~/scripts] $ /usr/local/libexec/freshports/git-to-freshports-xml.py --repo ports --path /var/db/ingress/repos/ports -s  bbc2474ef7a65eb8561c8ecf7af80c2bfed1f248 --spooling /tmp/ --output ~/tmp

And it created:

[dan@devgit-ingress01:~/tmp] $ cat 2021.04.20.09.58.35.000000.bbc2474ef7a65eb8561c8ecf7af80c2bfed1f248.xml 
<?xml version='1.0' encoding='UTF-8'?>
<UPDATES Version="1.5.0.0" Source="git">
  <UPDATE>
    <DATE Year="2021" Month="4" Day="20"/>
    <TIME Timezone="UTC" Hour="9" Minute="58" Second="35"/>
    <OS Repo="ports" Id="FreeBSD" Branch="main"/>
    <LOG>Add the prepare-commit-msg hook to the repository.

To make use of it, the easiest way is to run:

  git config --add core.hooksPath .hooks

Discussed with:	bapt</LOG>
    <PEOPLE>
      <COMMITTER CommitterName="Mathieu Arnold" CommitterEmail="mat@FreeBSD.org"/>
      <AUTHOR AuthorName="Mathieu Arnold" AuthorEmail="mat@FreeBSD.org"/>
    </PEOPLE>
    <COMMIT Hash="bbc2474ef7a65eb8561c8ecf7af80c2bfed1f248" HashShort="bbc2474" Subject="Add the prepare-commit-msg hook to the repository." EncodingLoses="false" Repository="ports"/>
    <FILES>
      <FILE Action="Add" Path=".hooks/prepare-commit-msg"/>
    </FILES>
  </UPDATE>
</UPDATES>
[dan@devgit-ingress01:~/tmp] $

The code change

The code change looks like this:

-                       # look for special files outside a port, such as LEGAL, GIDs, UIDs
-                       if ($subtree eq $FreshPorts::Config::ports_prefix && defined($FreshPorts::Constants::IgnoredItems{$category_name})) {
+                       # look for special files outside a port, such as LEGAL, GIDs, UIDs, .hooks
+                       if ( ! any {/$category_name/} @FreshPorts::Categories::categories ) {

That any function comes in via use List::MoreUtils 'any'; and is part of lang/p5-List-MoreUtils, which is already a dependency on all FreshPorts hosts.
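
For reference, any returns true if the block is true for at least one element of the list. A tiny, standalone illustration (using an exact string comparison rather than the regex match in the diff above):

use List::MoreUtils 'any';

my @categories    = ('devel', 'sysutils', 'www');
my $category_name = '.hooks';

if ( any { $_ eq $category_name } @categories ) {
    print "$category_name is a known category\n";
} else {
    print "$category_name is not a category\n";    # this branch runs
}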

Reprocessing the errant commit

When I reprocessed that .hooks commit, I got this:

getting id from 'commit_log_elements_id_seq'
sql is insert into commit_log_elements(id, commit_log_id, element_id, revision_name, change_type) values 
                                        (4397803, 852396, 1217151, 'bbc2474ef7a65eb8561c8ecf7af80c2bfed1f248', 'A')
sql = 'select ElementTagSet(1, 1217151, 'bbc2474ef7a65eb8561c8ecf7af80c2bfed1f248')'
pushing the following onto @Files
FileAction='Add'
FilePath='/ports/head/.hooks/prepare-commit-msg'
FileRevision='bbc2474ef7a65eb8561c8ecf7af80c2bfed1f248'
commit_log_element->{id}='4397803'
element_id='1217151'
Observer has noticed that commit 'bbc2474ef7a65eb8561c8ecf7af80c2bfed1f248' contains file /ports/head/.hooks/prepare-commit-msg as revision bbc2474ef7a65eb8561c8ecf7af80c2bfed1f248 in repos
STARTING _CompileListOfPorts ................................
for a commit on 'branch': 'head'
this commit is on head
FILE ==: Add, /ports/head/.hooks/prepare-commit-msg, bbc2474ef7a65eb8561c8ecf7af80c2bfed1f248, ports, .hooks, prepare-commit-msg, 4397803
YES, this file is in the ports tree
... but is not a file in a category on disk!

ENDING _CompileListOfPorts ................................

The code correctly determines that this is not a port, because .hooks is not in the list of categories.

Win. Thank you for coming to my TED talk.

May 082021
 

This post is related to a recent tweet.

It was months ago that I wondered why one port installed another as a dependency. I could see no link. Since these are FreshPorts ports, earlier this week I did some testing: I removed the package, and its own dependencies, from dev, then test, and finally stage.

Monitoring checks would soon detect if the #FreeBSD commits on each website started to deviate from the others, including production, which was not touched for this test.

By the end of the week, and after reviewing the code, it was safe to remove the dependency.

Today, that change hit production.

I like that the package system keeps track of dependencies. When that one package was removed, many others could be removed too: its dependencies, their dependencies, and so on.

All up, 114 packages were removed from the FreshPorts webserver.

pkg autoremove is a wonderful tool.
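
Before removing anything, pkg can also show what depends on a package and preview what autoremove would delete; something along these lines (the package name is just an example):

# list installed packages which require the named package
pkg info -r p5-freshports-scripts-git

# dry run: show what autoremove would delete, without deleting anything
pkg autoremove -n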

This was the pkg upgrade:

[root@nginx01 /]# pkg upgrade
Updating local repository catalogue...
[nginx01] Fetching meta.conf: 100%    163 B   0.2kB/s    00:01    
[nginx01] Fetching packagesite.txz: 100%  267 KiB 273.0kB/s    00:01    
Processing entries: 100%
local repository update completed. 1127 packages processed.
All repositories are up to date.
Checking for upgrades (30 candidates): 100%
Processing candidates (30 candidates): 100%
The following 29 package(s) will be affected (of 0 checked):

Installed packages to be UPGRADED:
	bash: 5.1.4_2 -> 5.1.8
	freshports-www-git: 2.0.20 -> 2.0.22_1
	p5-CGI: 4.51 -> 4.52
	php74: 7.4.18 -> 7.4.19
	php74-ctype: 7.4.18 -> 7.4.19
	php74-dom: 7.4.18 -> 7.4.19
	php74-exif: 7.4.18 -> 7.4.19
	php74-fileinfo: 7.4.18 -> 7.4.19
	php74-filter: 7.4.18 -> 7.4.19
	php74-gd: 7.4.18 -> 7.4.19
	php74-iconv: 7.4.18 -> 7.4.19
	php74-json: 7.4.18 -> 7.4.19
	php74-mbstring: 7.4.18 -> 7.4.19
	php74-opcache: 7.4.18 -> 7.4.19
	php74-pgsql: 7.4.18 -> 7.4.19
	php74-phar: 7.4.18 -> 7.4.19
	php74-posix: 7.4.18 -> 7.4.19
	php74-session: 7.4.18 -> 7.4.19
	php74-simplexml: 7.4.18 -> 7.4.19
	php74-tokenizer: 7.4.18 -> 7.4.19
	php74-xml: 7.4.18 -> 7.4.19
	php74-xmlreader: 7.4.18 -> 7.4.19
	php74-xmlwriter: 7.4.18 -> 7.4.19
	php74-zip: 7.4.18 -> 7.4.19
	php74-zlib: 7.4.18 -> 7.4.19
	postfix: 3.5.10,1 -> 3.6.0,1
	py38-freshports-fp-listen-git: 1.0.2_4 -> 1.0.2_5
	py38-six: 1.15.0 -> 1.16.0
	sqlite3: 3.35.5,1 -> 3.35.5_1,1

Number of packages to be upgraded: 29

12 MiB to be downloaded.

Proceed with this action? [y/N]: y
[nginx01] [1/29] Fetching sqlite3-3.35.5_1,1.txz: 100%    1 MiB   1.3MB/s    00:01    
[nginx01] [2/29] Fetching py38-six-1.16.0.txz: 100%   19 KiB  19.5kB/s    00:01    
[nginx01] [3/29] Fetching py38-freshports-fp-listen-git-1.0.2_5.txz: 100%    8 KiB   7.8kB/s    00:01    
[nginx01] [4/29] Fetching postfix-3.6.0,1.txz: 100%    2 MiB   1.6MB/s    00:01    
[nginx01] [5/29] Fetching php74-zlib-7.4.19.txz: 100%   16 KiB  16.5kB/s    00:01    
[nginx01] [6/29] Fetching php74-zip-7.4.19.txz: 100%   19 KiB  19.6kB/s    00:01    
[nginx01] [7/29] Fetching php74-xmlwriter-7.4.19.txz: 100%   11 KiB  11.3kB/s    00:01    
[nginx01] [8/29] Fetching php74-xmlreader-7.4.19.txz: 100%   11 KiB  10.8kB/s    00:01    
[nginx01] [9/29] Fetching php74-xml-7.4.19.txz: 100%   18 KiB  18.1kB/s    00:01    
[nginx01] [10/29] Fetching php74-tokenizer-7.4.19.txz: 100%    7 KiB   7.7kB/s    00:01    
[nginx01] [11/29] Fetching php74-simplexml-7.4.19.txz: 100%   21 KiB  21.4kB/s    00:01    
[nginx01] [12/29] Fetching php74-session-7.4.19.txz: 100%   34 KiB  34.3kB/s    00:01    
[nginx01] [13/29] Fetching php74-posix-7.4.19.txz: 100%   11 KiB  10.8kB/s    00:01    
[nginx01] [14/29] Fetching php74-phar-7.4.19.txz: 100%  102 KiB 104.5kB/s    00:01    
[nginx01] [15/29] Fetching php74-pgsql-7.4.19.txz: 100%   40 KiB  41.3kB/s    00:01    
[nginx01] [16/29] Fetching php74-opcache-7.4.19.txz: 100%  198 KiB 202.8kB/s    00:01    
[nginx01] [17/29] Fetching php74-mbstring-7.4.19.txz: 100%  743 KiB 761.1kB/s    00:01    
[nginx01] [18/29] Fetching php74-json-7.4.19.txz: 100%   20 KiB  20.0kB/s    00:01    
[nginx01] [19/29] Fetching php74-iconv-7.4.19.txz: 100%   16 KiB  16.2kB/s    00:01    
[nginx01] [20/29] Fetching php74-gd-7.4.19.txz: 100%   29 KiB  29.4kB/s    00:01    
[nginx01] [21/29] Fetching php74-filter-7.4.19.txz: 100%   18 KiB  18.3kB/s    00:01    
[nginx01] [22/29] Fetching php74-fileinfo-7.4.19.txz: 100%  267 KiB 273.0kB/s    00:01    
[nginx01] [23/29] Fetching php74-exif-7.4.19.txz: 100%   27 KiB  28.2kB/s    00:01    
[nginx01] [24/29] Fetching php74-dom-7.4.19.txz: 100%   52 KiB  53.5kB/s    00:01    
[nginx01] [25/29] Fetching php74-ctype-7.4.19.txz: 100%    4 KiB   4.6kB/s    00:01    
[nginx01] [26/29] Fetching php74-7.4.19.txz: 100%    4 MiB   4.2MB/s    00:01    
[nginx01] [27/29] Fetching p5-CGI-4.52.txz: 100%  152 KiB 155.7kB/s    00:01    
[nginx01] [28/29] Fetching freshports-www-git-2.0.22_1.txz: 100%    2 MiB   2.4MB/s    00:01    
[nginx01] [29/29] Fetching bash-5.1.8.txz: 100%    2 MiB   1.6MB/s    00:01    
Checking integrity... done (1 conflicting)
  - freshports-www-git-2.0.22_1 conflicts with p5-freshports-scripts-git-2.0.16 on /usr/local/etc/freshports/config.sh.sample
Checking integrity... done (0 conflicting)
Conflicts with the existing packages have been found.
One more solver iteration is needed to resolve them.
The following 30 package(s) will be affected (of 0 checked):

Installed packages to be REMOVED:
	p5-freshports-scripts-git: 2.0.16

Installed packages to be UPGRADED:
	bash: 5.1.4_2 -> 5.1.8
	freshports-www-git: 2.0.20 -> 2.0.22_1
	p5-CGI: 4.51 -> 4.52
	php74: 7.4.18 -> 7.4.19
	php74-ctype: 7.4.18 -> 7.4.19
	php74-dom: 7.4.18 -> 7.4.19
	php74-exif: 7.4.18 -> 7.4.19
	php74-fileinfo: 7.4.18 -> 7.4.19
	php74-filter: 7.4.18 -> 7.4.19
	php74-gd: 7.4.18 -> 7.4.19
	php74-iconv: 7.4.18 -> 7.4.19
	php74-json: 7.4.18 -> 7.4.19
	php74-mbstring: 7.4.18 -> 7.4.19
	php74-opcache: 7.4.18 -> 7.4.19
	php74-pgsql: 7.4.18 -> 7.4.19
	php74-phar: 7.4.18 -> 7.4.19
	php74-posix: 7.4.18 -> 7.4.19
	php74-session: 7.4.18 -> 7.4.19
	php74-simplexml: 7.4.18 -> 7.4.19
	php74-tokenizer: 7.4.18 -> 7.4.19
	php74-xml: 7.4.18 -> 7.4.19
	php74-xmlreader: 7.4.18 -> 7.4.19
	php74-xmlwriter: 7.4.18 -> 7.4.19
	php74-zip: 7.4.18 -> 7.4.19
	php74-zlib: 7.4.18 -> 7.4.19
	postfix: 3.5.10,1 -> 3.6.0,1
	py38-freshports-fp-listen-git: 1.0.2_4 -> 1.0.2_5
	py38-six: 1.15.0 -> 1.16.0
	sqlite3: 3.35.5,1 -> 3.35.5_1,1

Number of packages to be removed: 1
Number of packages to be upgraded: 29

Proceed with this action? [y/N]: y
[nginx01] [1/30] Upgrading php74 from 7.4.18 to 7.4.19...
[nginx01] [1/30] Extracting php74-7.4.19: 100%
[nginx01] [2/30] Upgrading php74-dom from 7.4.18 to 7.4.19...
[nginx01] [2/30] Extracting php74-dom-7.4.19: 100%
[nginx01] [3/30] Upgrading php74-zlib from 7.4.18 to 7.4.19...
[nginx01] [3/30] Extracting php74-zlib-7.4.19: 100%
[nginx01] [4/30] Upgrading php74-xml from 7.4.18 to 7.4.19...
[nginx01] [4/30] Extracting php74-xml-7.4.19: 100%
[nginx01] [5/30] Upgrading php74-zip from 7.4.18 to 7.4.19...
[nginx01] [5/30] Extracting php74-zip-7.4.19: 100%
[nginx01] [6/30] Upgrading php74-xmlwriter from 7.4.18 to 7.4.19...
[nginx01] [6/30] Extracting php74-xmlwriter-7.4.19: 100%
[nginx01] [7/30] Upgrading php74-xmlreader from 7.4.18 to 7.4.19...
[nginx01] [7/30] Extracting php74-xmlreader-7.4.19: 100%
[nginx01] [8/30] Upgrading php74-tokenizer from 7.4.18 to 7.4.19...
[nginx01] [8/30] Extracting php74-tokenizer-7.4.19: 100%
[nginx01] [9/30] Upgrading php74-simplexml from 7.4.18 to 7.4.19...
[nginx01] [9/30] Extracting php74-simplexml-7.4.19: 100%
[nginx01] [10/30] Upgrading php74-session from 7.4.18 to 7.4.19...
[nginx01] [10/30] Extracting php74-session-7.4.19: 100%
[nginx01] [11/30] Upgrading php74-posix from 7.4.18 to 7.4.19...
[nginx01] [11/30] Extracting php74-posix-7.4.19: 100%
[nginx01] [12/30] Upgrading php74-phar from 7.4.18 to 7.4.19...
[nginx01] [12/30] Extracting php74-phar-7.4.19: 100%
[nginx01] [13/30] Upgrading php74-pgsql from 7.4.18 to 7.4.19...
[nginx01] [13/30] Extracting php74-pgsql-7.4.19: 100%
[nginx01] [14/30] Upgrading php74-opcache from 7.4.18 to 7.4.19...
[nginx01] [14/30] Extracting php74-opcache-7.4.19: 100%
[nginx01] [15/30] Upgrading php74-mbstring from 7.4.18 to 7.4.19...
[nginx01] [15/30] Extracting php74-mbstring-7.4.19: 100%
[nginx01] [16/30] Upgrading php74-json from 7.4.18 to 7.4.19...
[nginx01] [16/30] Extracting php74-json-7.4.19: 100%
[nginx01] [17/30] Upgrading php74-iconv from 7.4.18 to 7.4.19...
[nginx01] [17/30] Extracting php74-iconv-7.4.19: 100%
[nginx01] [18/30] Upgrading php74-gd from 7.4.18 to 7.4.19...
[nginx01] [18/30] Extracting php74-gd-7.4.19: 100%
[nginx01] [19/30] Upgrading php74-filter from 7.4.18 to 7.4.19...
[nginx01] [19/30] Extracting php74-filter-7.4.19: 100%
[nginx01] [20/30] Upgrading php74-fileinfo from 7.4.18 to 7.4.19...
[nginx01] [20/30] Extracting php74-fileinfo-7.4.19: 100%
[nginx01] [21/30] Upgrading php74-exif from 7.4.18 to 7.4.19...
[nginx01] [21/30] Extracting php74-exif-7.4.19: 100%
[nginx01] [22/30] Upgrading php74-ctype from 7.4.18 to 7.4.19...
[nginx01] [22/30] Extracting php74-ctype-7.4.19: 100%
[nginx01] [23/30] Deinstalling p5-freshports-scripts-git-2.0.16...
[nginx01] [23/30] Deleting files for p5-freshports-scripts-git-2.0.16: 100%
==> You should manually remove the "freshports" user. 
==> You should manually remove the "ingress" user. 
==> You should manually remove the "freshports" group 
==> You should manually remove the "ingress" group 
[nginx01] [24/30] Upgrading py38-freshports-fp-listen-git from 1.0.2_4 to 1.0.2_5...
===> Creating groups.
Using existing group 'freshports'.
===> Creating users
Using existing user 'freshports'.
===> Creating homedir(s)
[nginx01] [24/30] Extracting py38-freshports-fp-listen-git-1.0.2_5: 100%
[nginx01] [25/30] Upgrading sqlite3 from 3.35.5,1 to 3.35.5_1,1...
[nginx01] [25/30] Extracting sqlite3-3.35.5_1,1: 100%
[nginx01] [26/30] Upgrading py38-six from 1.15.0 to 1.16.0...
[nginx01] [26/30] Extracting py38-six-1.16.0: 100%
[nginx01] [27/30] Upgrading bash from 5.1.4_2 to 5.1.8...
[nginx01] [27/30] Extracting bash-5.1.8: 100%
[nginx01] [28/30] Upgrading postfix from 3.5.10,1 to 3.6.0,1...
===> Creating groups.
Using existing group 'mail'.
Using existing group 'maildrop'.
Using existing group 'postfix'.
===> Creating users
Using existing user 'postfix'.
===> Creating homedir(s)
[nginx01] [28/30] Extracting postfix-3.6.0,1: 100%

===============================================================
Postfix was *not* activated in /usr/local/etc/mail/mailer.conf! 

To finish installation run the following commands:

  mkdir -p /usr/local/etc/mail
  install -m 0644 /usr/local/share/postfix/mailer.conf.postfix /usr/local/etc/mail/mailer.conf
===============================================================

[nginx01] [29/30] Upgrading p5-CGI from 4.51 to 4.52...
[nginx01] [29/30] Extracting p5-CGI-4.52: 100%
[nginx01] [30/30] Upgrading freshports-www-git from 2.0.20 to 2.0.22_1...
===> Creating groups.
Using existing group 'freshports'.
Using existing group 'www'.
===> Creating users
Using existing user 'freshports'.
===> Creating homedir(s)
Using existing user 'www'.
[nginx01] [30/30] Extracting freshports-www-git-2.0.22_1: 100%
You may need to manually remove /usr/local/etc/php-fpm.conf if it is no longer needed.
You may need to manually remove /usr/local/etc/syslog.d/freshports.conf if it is no longer needed.
You may need to manually remove /usr/local/etc/postfix/main.cf if it is no longer needed.
=====
Message from postfix-3.6.0,1:

--
If you are upgrading from prior postfix version, please see the README
files for recommended changes to your configuration and additional
http://www.postfix.org/COMPATIBILITY_README.html

Incompatible change with postfix 3.5.x
==========================================
Internal protocols have changed. You need to "postfix stop" before
updating, or before backing out to an earlier release, otherwise
long-running daemons (pickup, qmgr, verify, tlsproxy, postscreen)
may fail to communicate with the rest of Postfix (warnings, timeouts).

The purpose of this change is to produce better error messages, for
example, when someone configures the discard daemon as a bounce
service in master.cf, or vice versa.

This change will break third-party code that implements a
Postfix-internal protocol such as qpsmtpd. Programs that depend on
Postfix internal details are not supported.
You may need to manually remove /usr/local/etc/freshports/constants.local.php if it is no longer needed.
You may need to manually remove /usr/local/etc/freshports/database.php if it is no longer needed.
You may need to manually remove /usr/local/etc/freshports/robots.txt if it is no longer needed.
You may need to manually remove /usr/local/etc/freshports/vhosts.conf.nginx if it is no longer needed.
You may need to manually remove /usr/local/etc/newsyslog.conf.d/freshports-www.conf if it is no longer needed.
You may need to manually remove /usr/local/etc/syslog.d/freshports-www.conf if it is no longer needed.

Then, the magic happens:

[root@nginx01 /]# pkg autoremove
Checking integrity... done (0 conflicting)
Deinstallation has been requested for the following 114 packages:

Installed packages to be REMOVED:
	apr: 1.7.0.1.6.1_1
	bash-completion: 2.11,2
	cvsps: 2.1_2
	db5: 5.3.28_7
	gdbm: 1.19
	git-lite: 2.31.1_1
	glib: 2.66.7_1,1
	gmp: 6.2.1
	gnupg: 2.3.1
	gnutls: 3.6.15
	libassuan: 2.5.5
	libgcrypt: 1.9.2_1
	libgit2: 1.1.0
	libgpg-error: 1.42
	libksba: 1.5.1
	libssh2: 1.9.0_3,3
	libtasn1: 4.16.0_1
	libxslt: 1.1.34_1
	nettle: 3.7.2_2
	npth: 1.6
	p11-kit: 0.23.22_1
	p5-Algorithm-C3: 0.11
	p5-Authen-NTLM: 1.09_1
	p5-Bit-Vector: 7.4
	p5-CGI: 4.52
	p5-Carp-Clan: 6.08
	p5-Class-C3: 0.35
	p5-Class-ISA: 0.36_1
	p5-Class-Method-Modifiers: 2.13
	p5-Class-Observable: 1.04_1
	p5-Clone: 0.45
	p5-DBD-Pg: 3.14.2
	p5-DBI: 1.643
	p5-Data-OptList: 0.110
	p5-Date-Calc: 6.4
	p5-Devel-StackTrace: 2.04
	p5-Digest-HMAC: 1.03_1
	p5-Email-Abstract: 3.008_1
	p5-Email-Address: 1.912
	p5-Email-Address-XS: 1.04
	p5-Email-Date-Format: 1.005
	p5-Email-MIME: 1.949
	p5-Email-MIME-ContentType: 1.026
	p5-Email-MIME-Encodings: 1.315_2
	p5-Email-MessageID: 1.406_1
	p5-Email-Sender: 1.300031_2
	p5-Email-Simple: 2.216
	p5-Encode-Locale: 1.05
	p5-Error: 0.17029
	p5-Exporter-Tiny: 1.002002
	p5-File-Listing: 6.14
	p5-HTML-Parser: 3.75
	p5-HTML-Tagset: 3.20_1
	p5-HTTP-Cookies: 6.10
	p5-HTTP-Daemon: 6.12
	p5-HTTP-Date: 6.05
	p5-HTTP-Message: 6.29
	p5-HTTP-Negotiate: 6.01_1
	p5-IO-HTML: 1.001_1
	p5-IO-Socket-INET6: 2.72_1
	p5-IO-Socket-SSL: 2.070
	p5-IO-String: 1.08_1
	p5-LWP-MediaTypes: 6.04
	p5-List-MoreUtils: 0.430
	p5-List-MoreUtils-XS: 0.430
	p5-MRO-Compat: 0.13
	p5-Module-Pluggable: 5.2
	p5-Module-Runtime: 0.016
	p5-Moo: 2.005004
	p5-MooX-Types-MooseLike: 0.29
	p5-Mozilla-CA: 20200520
	p5-Net-HTTP: 6.21
	p5-Net-SSLeay: 1.88
	p5-Params-Util: 1.102
	p5-Role-Tiny: 2.002004
	p5-Socket6: 0.29
	p5-Sub-Exporter: 0.987_1
	p5-Sub-Install: 0.928_1
	p5-Sub-Quote: 2.006006
	p5-Term-ReadKey: 2.38_1
	p5-Text-Unidecode: 1.30
	p5-Text-Wrapper: 1.05
	p5-Throwable: 0.200013
	p5-TimeDate: 2.33,1
	p5-Try-Tiny: 0.30
	p5-URI: 5.07
	p5-Unix-Syslog: 1.1_1
	p5-WWW-RobotRules: 6.02_1
	p5-XML-DOM: 1.44_1
	p5-XML-DOM-XPath: 0.14_1
	p5-XML-Node: 0.11_1
	p5-XML-Parser: 2.44
	p5-XML-RegExp: 0.04_1
	p5-XML-Writer: 0.900
	p5-XML-XPathEngine: 0.14_1
	p5-freshports-modules-git: 2.0.7
	p5-libwww: 6.53
	p5-libxml: 0.08_1
	p5-subversion: 1.14.1
	pinentry: 1.1.1
	pinentry-curses: 1.1.1
	py37-backports: 1
	py38-cached-property: 1.5.2
	py38-cffi: 1.14.5
	py38-pycparser: 2.20
	py38-pygit2: 1.5.0
	py38-six: 1.16.0
	python37: 3.7.10
	serf: 1.3.9_6
	sqlite3: 3.35.5_1,1
	subversion: 1.14.1
	tpm-emulator: 0.7.4_2
	trousers: 0.3.14_3
	utf8proc: 2.6.1

Number of packages to be removed: 114

The operation will free 298 MiB.

Proceed with deinstalling packages? [y/N]: y
[nginx01] [1/114] Deinstalling p5-subversion-1.14.1...
[nginx01] [1/114] Deleting files for p5-subversion-1.14.1: 100%
[nginx01] [2/114] Deinstalling p5-freshports-modules-git-2.0.7...
[nginx01] [2/114] Deleting files for p5-freshports-modules-git-2.0.7: 100%
==> You should manually remove the "freshports" user. 
==> You should manually remove the "ingress" user. 
==> You should manually remove the "freshports" group 
==> You should manually remove the "ingress" group 
[nginx01] [3/114] Deinstalling p5-XML-DOM-XPath-0.14_1...
[nginx01] [3/114] Deleting files for p5-XML-DOM-XPath-0.14_1: 100%
[nginx01] [4/114] Deinstalling subversion-1.14.1...
[nginx01] [4/114] Deleting files for subversion-1.14.1: 100%
[nginx01] [5/114] Deinstalling p5-XML-DOM-1.44_1...
[nginx01] [5/114] Deleting files for p5-XML-DOM-1.44_1: 100%
[nginx01] [6/114] Deinstalling gnupg-2.3.1...
[nginx01] [6/114] Deleting files for gnupg-2.3.1: 100%
[nginx01] [7/114] Deinstalling p5-Email-Sender-1.300031_2...
[nginx01] [7/114] Deleting files for p5-Email-Sender-1.300031_2: 100%
[nginx01] [8/114] Deinstalling p5-libwww-6.53...
[nginx01] [8/114] Deleting files for p5-libwww-6.53: 100%
[nginx01] [9/114] Deinstalling p5-CGI-4.52...
[nginx01] [9/114] Deleting files for p5-CGI-4.52: 100%
[nginx01] [10/114] Deinstalling gnutls-3.6.15...
[nginx01] [10/114] Deleting files for gnutls-3.6.15: 100%
[nginx01] [11/114] Deinstalling pinentry-1.1.1...
[nginx01] [11/114] Deleting files for pinentry-1.1.1: 100%
[nginx01] [12/114] Deinstalling py38-pygit2-1.5.0...
[nginx01] [12/114] Deleting files for py38-pygit2-1.5.0: 100%
[nginx01] [13/114] Deinstalling p5-Email-Abstract-3.008_1...
[nginx01] [13/114] Deleting files for p5-Email-Abstract-3.008_1: 100%
[nginx01] [14/114] Deinstalling p5-Throwable-0.200013...
[nginx01] [14/114] Deleting files for p5-Throwable-0.200013: 100%
[nginx01] [15/114] Deinstalling p5-HTML-Parser-3.75...
[nginx01] [15/114] Deleting files for p5-HTML-Parser-3.75: 100%
[nginx01] [16/114] Deinstalling p5-HTTP-Cookies-6.10...
[nginx01] [16/114] Deleting files for p5-HTTP-Cookies-6.10: 100%
[nginx01] [17/114] Deinstalling p5-HTTP-Daemon-6.12...
[nginx01] [17/114] Deleting files for p5-HTTP-Daemon-6.12: 100%
[nginx01] [18/114] Deinstalling p5-HTTP-Negotiate-6.01_1...
[nginx01] [18/114] Deleting files for p5-HTTP-Negotiate-6.01_1: 100%
[nginx01] [19/114] Deinstalling p5-Net-HTTP-6.21...
[nginx01] [19/114] Deleting files for p5-Net-HTTP-6.21: 100%
[nginx01] [20/114] Deinstalling serf-1.3.9_6...
[nginx01] [20/114] Deleting files for serf-1.3.9_6: 100%
[nginx01] [21/114] Deinstalling p11-kit-0.23.22_1...
[nginx01] [21/114] Deleting files for p11-kit-0.23.22_1: 100%
[nginx01] [22/114] Deinstalling trousers-0.3.14_3...
[nginx01] [22/114] Deleting files for trousers-0.3.14_3: 100%
==> You should manually remove the "_tss" user. 
==> You should manually remove the "_tss" group 
[nginx01] [23/114] Deinstalling pinentry-curses-1.1.1...
[nginx01] [23/114] Deleting files for pinentry-curses-1.1.1: 100%
[nginx01] [24/114] Deinstalling py38-cffi-1.14.5...
[nginx01] [24/114] Deleting files for py38-cffi-1.14.5: 100%
[nginx01] [25/114] Deinstalling libxslt-1.1.34_1...
[nginx01] [25/114] Deleting files for libxslt-1.1.34_1: 100%
[nginx01] [26/114] Deinstalling p5-MRO-Compat-0.13...
[nginx01] [26/114] Deleting files for p5-MRO-Compat-0.13: 100%
[nginx01] [27/114] Deinstalling p5-Date-Calc-6.4...
[nginx01] [27/114] Deleting files for p5-Date-Calc-6.4: 100%
[nginx01] [28/114] Deinstalling p5-Moo-2.005004...
[nginx01] [28/114] Deleting files for p5-Moo-2.005004: 100%
[nginx01] [29/114] Deinstalling p5-HTTP-Message-6.29...
[nginx01] [29/114] Deleting files for p5-HTTP-Message-6.29: 100%
[nginx01] [30/114] Deinstalling p5-Sub-Exporter-0.987_1...
[nginx01] [30/114] Deleting files for p5-Sub-Exporter-0.987_1: 100%
[nginx01] [31/114] Deinstalling p5-Email-MIME-1.949...
[nginx01] [31/114] Deleting files for p5-Email-MIME-1.949: 100%
[nginx01] [32/114] Deinstalling p5-File-Listing-6.14...
[nginx01] [32/114] Deleting files for p5-File-Listing-6.14: 100%
[nginx01] [33/114] Deinstalling p5-IO-Socket-SSL-2.070...
[nginx01] [33/114] Deleting files for p5-IO-Socket-SSL-2.070: 100%
[nginx01] [34/114] Deinstalling apr-1.7.0.1.6.1_1...
[nginx01] [34/114] Deleting files for apr-1.7.0.1.6.1_1: 100%
[nginx01] [35/114] Deinstalling bash-completion-2.11,2...
[nginx01] [35/114] Deleting files for bash-completion-2.11,2: 100%
[nginx01] [36/114] Deinstalling glib-2.66.7_1,1...
[nginx01] [36/114] Deleting files for glib-2.66.7_1,1: 100%
[nginx01] [37/114] Deinstalling nettle-3.7.2_2...
[nginx01] [37/114] Deleting files for nettle-3.7.2_2: 100%
[nginx01] [38/114] Deinstalling tpm-emulator-0.7.4_2...
[nginx01] [38/114] Deleting files for tpm-emulator-0.7.4_2: 100%
==> You should manually remove the "_tss" user. 
==> You should manually remove the "_tss" group 
[nginx01] [39/114] Deinstalling libassuan-2.5.5...
[nginx01] [39/114] Deleting files for libassuan-2.5.5: 100%
[nginx01] [40/114] Deinstalling libgcrypt-1.9.2_1...
[nginx01] [40/114] Deleting files for libgcrypt-1.9.2_1: 100%
[nginx01] [41/114] Deinstalling libgit2-1.1.0...
[nginx01] [41/114] Deleting files for libgit2-1.1.0: 100%
[nginx01] [42/114] Deinstalling libksba-1.5.1...
[nginx01] [42/114] Deleting files for libksba-1.5.1: 100%
[nginx01] [43/114] Deinstalling p5-Class-C3-0.35...
[nginx01] [43/114] Deleting files for p5-Class-C3-0.35: 100%
[nginx01] [44/114] Deinstalling p5-Authen-NTLM-1.09_1...
[nginx01] [44/114] Deleting files for p5-Authen-NTLM-1.09_1: 100%
[nginx01] [45/114] Deinstalling p5-Bit-Vector-7.4...
[nginx01] [45/114] Deleting files for p5-Bit-Vector-7.4: 100%
[nginx01] [46/114] Deinstalling p5-Class-Observable-1.04_1...
[nginx01] [46/114] Deleting files for p5-Class-Observable-1.04_1: 100%
[nginx01] [47/114] Deinstalling p5-Role-Tiny-2.002004...
[nginx01] [47/114] Deleting files for p5-Role-Tiny-2.002004: 100%
[nginx01] [48/114] Deinstalling p5-DBD-Pg-3.14.2...
[nginx01] [48/114] Deleting files for p5-DBD-Pg-3.14.2: 100%
[nginx01] [49/114] Deinstalling p5-Data-OptList-0.110...
[nginx01] [49/114] Deleting files for p5-Data-OptList-0.110: 100%
[nginx01] [50/114] Deinstalling p5-Email-Simple-2.216...
[nginx01] [50/114] Deleting files for p5-Email-Simple-2.216: 100%
[nginx01] [51/114] Deinstalling p5-Email-MIME-ContentType-1.026...
[nginx01] [51/114] Deleting files for p5-Email-MIME-ContentType-1.026: 100%
[nginx01] [52/114] Deinstalling p5-List-MoreUtils-0.430...
[nginx01] [52/114] Deleting files for p5-List-MoreUtils-0.430: 100%
[nginx01] [53/114] Deinstalling p5-HTTP-Date-6.05...
[nginx01] [53/114] Deleting files for p5-HTTP-Date-6.05: 100%
[nginx01] [54/114] Deinstalling p5-IO-Socket-INET6-2.72_1...
[nginx01] [54/114] Deleting files for p5-IO-Socket-INET6-2.72_1: 100%
[nginx01] [55/114] Deinstalling p5-MooX-Types-MooseLike-0.29...
[nginx01] [55/114] Deleting files for p5-MooX-Types-MooseLike-0.29: 100%
[nginx01] [56/114] Deinstalling p5-WWW-RobotRules-6.02_1...
[nginx01] [56/114] Deleting files for p5-WWW-RobotRules-6.02_1: 100%
[nginx01] [57/114] Deinstalling p5-XML-Node-0.11_1...
[nginx01] [57/114] Deleting files for p5-XML-Node-0.11_1: 100%
[nginx01] [58/114] Deinstalling p5-libxml-0.08_1...
[nginx01] [58/114] Deleting files for p5-libxml-0.08_1: 100%
[nginx01] [59/114] Deinstalling py37-backports-1...
[nginx01] [59/114] Deleting files for py37-backports-1: 100%
[nginx01] [60/114] Deinstalling py38-cached-property-1.5.2...
[nginx01] [60/114] Deleting files for py38-cached-property-1.5.2: 100%
[nginx01] [61/114] Deinstalling py38-pycparser-2.20...
[nginx01] [61/114] Deleting files for py38-pycparser-2.20: 100%
[nginx01] [62/114] Deinstalling cvsps-2.1_2...
[nginx01] [62/114] Deleting files for cvsps-2.1_2: 100%
[nginx01] [63/114] Deinstalling db5-5.3.28_7...
[nginx01] [63/114] Deleting files for db5-5.3.28_7: 100%
[nginx01] [64/114] Deinstalling git-lite-2.31.1_1...
[nginx01] [64/114] Deleting files for git-lite-2.31.1_1: 100%
==> You should manually remove the "git_daemon" user. 
==> You should manually remove the "git_daemon" group 
[nginx01] [65/114] Deinstalling gdbm-1.19...
[nginx01] [65/114] Deleting files for gdbm-1.19: 100%
[nginx01] [66/114] Deinstalling gmp-6.2.1...
[nginx01] [66/114] Deleting files for gmp-6.2.1: 100%
[nginx01] [67/114] Deinstalling sqlite3-3.35.5_1,1...
[nginx01] [67/114] Deleting files for sqlite3-3.35.5_1,1: 100%
[nginx01] [68/114] Deinstalling libgpg-error-1.42...
[nginx01] [68/114] Deleting files for libgpg-error-1.42: 100%
[nginx01] [69/114] Deinstalling libssh2-1.9.0_3,3...
[nginx01] [69/114] Deleting files for libssh2-1.9.0_3,3: 100%
[nginx01] [70/114] Deinstalling libtasn1-4.16.0_1...
[nginx01] [70/114] Deleting files for libtasn1-4.16.0_1: 100%
[nginx01] [71/114] Deinstalling npth-1.6...
[nginx01] [71/114] Deleting files for npth-1.6: 100%
[nginx01] [72/114] Deinstalling p5-Algorithm-C3-0.11...
[nginx01] [72/114] Deleting files for p5-Algorithm-C3-0.11: 100%
[nginx01] [73/114] Deinstalling p5-Carp-Clan-6.08...
[nginx01] [73/114] Deleting files for p5-Carp-Clan-6.08: 100%
[nginx01] [74/114] Deinstalling p5-Class-ISA-0.36_1...
[nginx01] [74/114] Deleting files for p5-Class-ISA-0.36_1: 100%
[nginx01] [75/114] Deinstalling p5-Class-Method-Modifiers-2.13...
[nginx01] [75/114] Deleting files for p5-Class-Method-Modifiers-2.13: 100%
[nginx01] [76/114] Deinstalling p5-Clone-0.45...
[nginx01] [76/114] Deleting files for p5-Clone-0.45: 100%
[nginx01] [77/114] Deinstalling p5-DBI-1.643...
[nginx01] [77/114] Deleting files for p5-DBI-1.643: 100%
[nginx01] [78/114] Deinstalling p5-Devel-StackTrace-2.04...
[nginx01] [78/114] Deleting files for p5-Devel-StackTrace-2.04: 100%
[nginx01] [79/114] Deinstalling p5-Digest-HMAC-1.03_1...
[nginx01] [79/114] Deleting files for p5-Digest-HMAC-1.03_1: 100%
[nginx01] [80/114] Deinstalling p5-Email-Address-1.912...
[nginx01] [80/114] Deleting files for p5-Email-Address-1.912: 100%
[nginx01] [81/114] Deinstalling p5-Email-Address-XS-1.04...
[nginx01] [81/114] Deleting files for p5-Email-Address-XS-1.04: 100%
[nginx01] [82/114] Deinstalling p5-Email-Date-Format-1.005...
[nginx01] [82/114] Deleting files for p5-Email-Date-Format-1.005: 100%
[nginx01] [83/114] Deinstalling p5-Email-MIME-Encodings-1.315_2...
[nginx01] [83/114] Deleting files for p5-Email-MIME-Encodings-1.315_2: 100%
[nginx01] [84/114] Deinstalling p5-Email-MessageID-1.406_1...
[nginx01] [84/114] Deleting files for p5-Email-MessageID-1.406_1: 100%
[nginx01] [85/114] Deinstalling p5-Encode-Locale-1.05...
[nginx01] [85/114] Deleting files for p5-Encode-Locale-1.05: 100%
[nginx01] [86/114] Deinstalling p5-Error-0.17029...
[nginx01] [86/114] Deleting files for p5-Error-0.17029: 100%
[nginx01] [87/114] Deinstalling p5-Exporter-Tiny-1.002002...
[nginx01] [87/114] Deleting files for p5-Exporter-Tiny-1.002002: 100%
[nginx01] [88/114] Deinstalling p5-HTML-Tagset-3.20_1...
[nginx01] [88/114] Deleting files for p5-HTML-Tagset-3.20_1: 100%
[nginx01] [89/114] Deinstalling p5-IO-HTML-1.001_1...
[nginx01] [89/114] Deleting files for p5-IO-HTML-1.001_1: 100%
[nginx01] [90/114] Deinstalling p5-IO-String-1.08_1...
[nginx01] [90/114] Deleting files for p5-IO-String-1.08_1: 100%
[nginx01] [91/114] Deinstalling p5-LWP-MediaTypes-6.04...
[nginx01] [91/114] Deleting files for p5-LWP-MediaTypes-6.04: 100%
[nginx01] [92/114] Deinstalling p5-List-MoreUtils-XS-0.430...
[nginx01] [92/114] Deleting files for p5-List-MoreUtils-XS-0.430: 100%
[nginx01] [93/114] Deinstalling p5-Module-Pluggable-5.2...
[nginx01] [93/114] Deleting files for p5-Module-Pluggable-5.2: 100%
[nginx01] [94/114] Deinstalling p5-Module-Runtime-0.016...
[nginx01] [94/114] Deleting files for p5-Module-Runtime-0.016: 100%
[nginx01] [95/114] Deinstalling p5-Mozilla-CA-20200520...
[nginx01] [95/114] Deleting files for p5-Mozilla-CA-20200520: 100%
[nginx01] [96/114] Deinstalling p5-Net-SSLeay-1.88...
[nginx01] [96/114] Deleting files for p5-Net-SSLeay-1.88: 100%
[nginx01] [97/114] Deinstalling p5-Params-Util-1.102...
[nginx01] [97/114] Deleting files for p5-Params-Util-1.102: 100%
[nginx01] [98/114] Deinstalling p5-Socket6-0.29...
[nginx01] [98/114] Deleting files for p5-Socket6-0.29: 100%
[nginx01] [99/114] Deinstalling p5-Sub-Install-0.928_1...
[nginx01] [99/114] Deleting files for p5-Sub-Install-0.928_1: 100%
[nginx01] [100/114] Deinstalling p5-Sub-Quote-2.006006...
[nginx01] [100/114] Deleting files for p5-Sub-Quote-2.006006: 100%
[nginx01] [101/114] Deinstalling p5-Term-ReadKey-2.38_1...
[nginx01] [101/114] Deleting files for p5-Term-ReadKey-2.38_1: 100%
[nginx01] [102/114] Deinstalling p5-Text-Unidecode-1.30...
[nginx01] [102/114] Deleting files for p5-Text-Unidecode-1.30: 100%
[nginx01] [103/114] Deinstalling p5-Text-Wrapper-1.05...
[nginx01] [103/114] Deleting files for p5-Text-Wrapper-1.05: 100%
[nginx01] [104/114] Deinstalling p5-TimeDate-2.33,1...
[nginx01] [104/114] Deleting files for p5-TimeDate-2.33,1: 100%
[nginx01] [105/114] Deinstalling p5-Try-Tiny-0.30...
[nginx01] [105/114] Deleting files for p5-Try-Tiny-0.30: 100%
[nginx01] [106/114] Deinstalling p5-URI-5.07...
[nginx01] [106/114] Deleting files for p5-URI-5.07: 100%
[nginx01] [107/114] Deinstalling p5-Unix-Syslog-1.1_1...
[nginx01] [107/114] Deleting files for p5-Unix-Syslog-1.1_1: 100%
[nginx01] [108/114] Deinstalling p5-XML-Parser-2.44...
[nginx01] [108/114] Deleting files for p5-XML-Parser-2.44: 100%
[nginx01] [109/114] Deinstalling p5-XML-RegExp-0.04_1...
[nginx01] [109/114] Deleting files for p5-XML-RegExp-0.04_1: 100%
[nginx01] [110/114] Deinstalling p5-XML-Writer-0.900...
[nginx01] [110/114] Deleting files for p5-XML-Writer-0.900: 100%
[nginx01] [111/114] Deinstalling p5-XML-XPathEngine-0.14_1...
[nginx01] [111/114] Deleting files for p5-XML-XPathEngine-0.14_1: 100%
[nginx01] [112/114] Deinstalling py38-six-1.16.0...
[nginx01] [112/114] Deleting files for py38-six-1.16.0: 100%
[nginx01] [113/114] Deinstalling python37-3.7.10...
[nginx01] [113/114] Deleting files for python37-3.7.10: 100%
[nginx01] [114/114] Deinstalling utf8proc-2.6.1...
[nginx01] [114/114] Deleting files for utf8proc-2.6.1: 100%
[root@nginx01 /]# 
Apr 13 2021
 

This post is mostly for personal use as I investigate this topic. It may take a few days and I need to keep notes.

Last week, FreshPorts went live with git commits. There are some outstanding issues, such as commits to branches, which I will catch up on in the weeks to come. During that update process, I noticed two tables which I thought were unused. I made a note to investigate them later. Those tables are:

freshports.devgit=# \dt commits_latest*
                List of relations
 Schema |         Name         | Type  |  Owner   
--------+----------------------+-------+----------
 public | commits_latest       | table | postgres
 public | commits_latest_ports | table | postgres
(2 rows)

freshports.devgit=# 

Intent

I recall the intent of those tables: to retain a list of the most recent commits for quick display on the home page and in news feeds. I’ve gotten better at SQL since those days and I see they contain no data:

freshports.devgit=# select count(*) from commits_latest;
 count 
-------
     0
(1 row)

freshports.devgit=# select count(*) from commits_latest_ports;
 count 
-------
     0
(1 row)

freshports.devgit=# 
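
Empty row counts alone don't prove nothing touches these tables; the statistics collector can show whether they have seen any scans or inserts since the stats were last reset. A quick check along those lines (not part of the original investigation):

select relname, seq_scan, idx_scan, n_tup_ins, n_tup_upd, n_tup_del
  from pg_stat_user_tables
 where relname like 'commits_latest%';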

What do the tables look like?

For the record, this is what they look like:

freshports.devgit=# \d commits_latest
                         Table "public.commits_latest"
       Column       |           Type           | Collation | Nullable | Default 
--------------------+--------------------------+-----------+----------+---------
 commit_log_id      | integer                  |           |          | 
 commit_date_raw    | timestamp with time zone |           |          | 
 message_subject    | text                     |           |          | 
 message_id         | text                     |           |          | 
 committer          | text                     |           |          | 
 commit_description | text                     |           |          | 
 commit_date        | text                     |           |          | 
 commit_time        | text                     |           |          | 
 element_id         | integer                  |           |          | 
 element_name       | text                     |           |          | 
 revision_name      | text                     |           |          | 
 status             | character(1)             |           |          | 
 encoding_losses    | boolean                  |           |          | 
 element_pathname   | text                     |           |          | 

freshports.devgit=# \d commits_latest_ports
                     Table "public.commits_latest_ports"
    Column     |            Type             | Collation | Nullable | Default 
---------------+-----------------------------+-----------+----------+---------
 commit_log_id | integer                     |           | not null | 
 commit_date   | timestamp without time zone |           | not null | 
Indexes:
    "commits_latest_ports_pkey" PRIMARY KEY, btree (commit_log_id)
Foreign-key constraints:
    "$1" FOREIGN KEY (commit_log_id) REFERENCES commit_log(id) ON UPDATE CASCADE ON DELETE CASCADE

freshports.devgit=# 

Does the website use them?

No, they're not in use there:

[dan@devgit-nginx01:~/www] $ grep -ri commits_latest *
grep: configuration/robots.txt: Permission denied
grep: configuration/vhosts.conf: Permission denied
grep: configuration/vhosts.conf.nginx: No such file or directory
grep: configuration/virtualhost-common.conf: Permission denied
grep: configuration/status-config.php: No such file or directory
grep: configuration/database.php: Permission denied
docs/physical_database-2016-06-27.svg: >rel_commits_latest_ports_commit_log
docs/physical_database-2016-06-27.svg: >commits_latest
docs/physical_database-2016-06-27.svg: >commits_latest_ports
grep: www/robots.txt: Permission denied
[dan@devgit-nginx01:~/www] $ 

What about the backend?

No, they're not there either:

[dan@devgit-ingress01:~/scripts] $ grep -ri commits_latest *
[dan@devgit-ingress01:~/scripts] $ cd ~/modules
[dan@devgit-ingress01:~/modules] $ grep -ri commits_latest *
grep: config.pm: Permission denied
[dan@devgit-ingress01:~/modules] $ 

What about the database?

Perhaps they're used by the stored procedures, functions, referential integrity, etc.:

[dan@pro02:~/src/freshports/database-schema-git] $ grep -ri commits_latest *
FreshPorts2.pdm:commits_latest
FreshPorts2.pdm:commits_latest
FreshPorts2.pdm:commits_latest
FreshPorts2.pdm:PK_COMMITS_LATEST
FreshPorts2.pdm:commits_latest_ports
FreshPorts2.pdm:commits_latest_ports
FreshPorts2.pdm:commits_latest_ports
createdb.sql:create table commits_latest
createdb.sql:create table commits_latest_ports
ri.txt:       WHERE (NOT (EXISTS (SELECT commits_latest_ports.commit_log_id
ri.txt:                             FROM latest_commits_ports commits_latest_ports
ri.txt:                            WHERE (commits_latest_ports.commit_log_id = new.commit_log_id)
sp.txt:       WHERE (NOT (EXISTS (SELECT commits_latest_ports.commit_log_id
sp.txt:                             FROM latest_commits_ports commits_latest_ports
sp.txt:                            WHERE (commits_latest_ports.commit_log_id = CommitLogID)
[dan@pro02:~/src/freshports/database-schema-git] $ 

Only ri.txt and sp.txt are relevant here:

From ri.txt:

CREATE OR REPLACE FUNCTION commit_log_ports_insert () RETURNS TRIGGER
AS '
BEGIN
   INSERT INTO latest_commits_ports (commit_log_id, commit_date)
              SELECT new.commit_log_id,
            (SELECT commit_log.commit_date
               FROM commit_log
              WHERE (commit_log.id = new.commit_log_id)
            )
       WHERE (NOT (EXISTS (SELECT commits_latest_ports.commit_log_id
                             FROM latest_commits_ports commits_latest_ports
                            WHERE (commits_latest_ports.commit_log_id = new.commit_log_id)
                           )
                   )
             );
   RETURN new;
END;'
LANGUAGE plpgsql;

  DROP TRIGGER IF EXISTS commit_log_ports_insert ON commit_log_ports;
CREATE TRIGGER commit_log_ports_insert
AFTER INSERT ON commit_log_ports
FOR EACH ROW
EXECUTE PROCEDURE commit_log_ports_insert ();

For each row added to commit_log_ports, invoke the trigger commit_log_ports_insert().

I know commit_log_ports is used. I don’t know about latest_commits_ports.

It seems we have two more tables to review:

freshports.devgit=# \dt latest_commits*
                List of relations
 Schema |         Name         | Type  |  Owner   
--------+----------------------+-------+----------
 public | latest_commits       | table | postgres
 public | latest_commits_ports | table | postgres
(2 rows)

Those tables do seem to be used:

freshports.devgit=# select count(*) from latest_commits;
 count  
--------
 693798
(1 row)

freshports.devgit=# select count(*) from latest_commits_ports;
 count  
--------
 159794
(1 row)

From sp.txt:

CREATE OR REPLACE FUNCTION commit_log_ports_insert(int) RETURNS boolean
AS $$
DECLARE
        CommitLogID     ALIAS for $1;
BEGIN
   INSERT INTO latest_commits_ports (commit_log_id, commit_date)
              SELECT CommitLogID, 
            (SELECT commit_log.commit_date
               FROM commit_log
              WHERE (commit_log.id = CommitLogID)
            )
       WHERE (NOT (EXISTS (SELECT commits_latest_ports.commit_log_id
                             FROM latest_commits_ports commits_latest_ports
                            WHERE (commits_latest_ports.commit_log_id = CommitLogID)
                           )
                   )
             );

   RETURN FOUND;
END;$$

LANGUAGE plpgsql;

There is nothing here not already covered by ri.txt.

Who uses latest_commits_ports?

I found this function:

CREATE OR REPLACE FUNCTION latest_commits_ports_anchor () RETURNS integer
    AS $$
  SELECT commit_log_id AS RESULT
    FROM latest_commits_ports
ORDER BY commit_log_id
   LIMIT 1;
$$
    LANGUAGE sql 
STABLE;

I question this function.

This is used by:

  1. LatestCommitsLarge()
  2. LatestCommitsLargeFiltered()
  3. LatestCommitsSmall()
But those functions do stuff like this:

                          JOIN commit_log_ports_elements CLPE on CLPE.commit_log_id = LCPCL.commit_log_id
                                                             AND CLPE.commit_log_id > latest_commits_ports_anchor()
    

The above are used by:

  1. LatestCommits()
  2. LatestCommitsFiltered()

Look at classes/latest_commits.php:

                if (IsSet($this->Filter)) {
                        $sql = "select * from LatestCommitsFiltered($this->MaxNumberOfPorts, $this->UserID, '" . pg_escape_string($this->Filter) . "')";
                } else {
#                       $sql = "select * from LatestCommits($this->MaxNumberOfPorts, $this->UserID)";
                        $sql = "
  SELECT LC.*, STF.message AS stf_message
    FROM LatestCommits(" . pg_escape_string($this->MaxNumberOfPorts) . ", 0, '" . pg_escape_string($this->BranchName) . "') LC LEFT OUTER JOIN sanity_test_failures STF
      ON LC.commit_log_id = STF.commit_log_id
ORDER BY LC.commit_date_raw DESC, LC.category, LC.port, element_pathname";
                }

This is used by commits.php and filter.php (the latter of which has no links to it, and it seems it can be deleted).

I also see reference to

Actions

While working through the above, I kept this section up to date as I went.

  1. Examine LatestCommitsLarge() vs LatestCommitsSmall() – do we still need both?
  2. LatestCommitsLarge is the one used by commits.php – does the AND CLPE.commit_log_id > latest_commits_ports_anchor() clause have useful effect?
  3. remove filter.php
  4. drop functions LatestCommitsFiltered() & LatestCommitsLargeFiltered()
  5. remove www/caching-files directory
  6. Reimplement www/caching-files/categories.php via a backend script, perhaps periodic
  7. look at scripts/cache-refresh.sh, which hasn’t been invoked in production since 2017, according to the contents of the category_stats table which is displayed on the categories.php page.
  8. The CategoryStatsUpdate() function might benefit from this:
    SELECT categories.name, categories.id               AS category_id,
             count(ports_active.id)      AS count,
             max(commit_log.commit_date) AS updated,
    		 case when categories.is_primary THEN '' ELSE '*' END
        FROM ports_categories, categories, ports_active left outer join commit_log on ( ports_active.last_commit_id = commit_log.id )
       WHERE ports_active.id = ports_categories.port_id
         AND categories.id   = ports_categories.category_id
    GROUP BY categories.id, categories.name, categories.description, is_primary, categories.element_id
    order by categories.name
    
Mar 302021
 

Hello,

The changeover from subversion to git for the FreeBSD project enters its final stage about three hours from now (as I type this). At 3:01 UTC on Mar 31 2021, the repository will become read-only. The full schedule for that process is in the FreeBSD wiki.

The first git commit is not expected until Apr 3rd.

That gives FreshPorts a couple of days to convert from subversion to git. The coding has been completed. The database needs to be modified, the new code installed, and the configuration updated.

This change to FreshPorts represents the final stage of work which started over 13 months ago.

This process has had a few dry runs and it takes a few hours. Given the 3-day timeframe, the FreshPorts plan is laid out below.

  1. Drop the TTL because we will change to a new server.
  2. Disable user logins & disable account updates – a backup will be taken and any subsequent updates to production would not appear in the new database. About 12:30 UTC on 31 March
  3. Production will continue to serve up content during the updates.
  4. The new git-enabled website should be ready by about 20:00 UTC on 31 March.

The goals include:

  1. Apart from logins being disabled, you should not notice any change in service during the transition
  2. The plan is to reproduce the website on an AWS instance – this will be our first venture into that venue
  3. Service will gradually be moved to the new website through the use of multiple A and AAAA records.

Please watch our Twitter account and status page for updates.

Mar 092021
 

I had noticed a problem with my full text search solution when I was testing. If the input query contained a quote, the query raised an error. At the time, I attributed that to poor input/text handling. Later, I realized I was wrong.

On the website, this PHP code is used:

$result = pg_query_params($this->dbh, $sql, array($this->Query));

Where $this->Query is what the user supplied on the search form. The pg_query_params documentation says

The primary advantage of pg_query_params() over pg_query() is that parameter values may be separated from the query string, thus avoiding the need for tedious and error-prone quoting and escaping.

So I don’t need to do anything special.
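
The $sql string itself is not shown above; roughly, it looked like this (a sketch only, using the generate_plist table and column from the related full text search posts; the production query is more involved), with $1 filled in by pg_query_params():

-- a sketch: $1 is the raw user-supplied search text, passed in via pg_query_params()
SELECT port_id
  FROM generate_plist
 WHERE textsearchable_index_col @@ to_tsquery('english', $1);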

Here is some testing I tried while talking this over via IRC:

freshports.dev=# SELECT to_tsquery('english', $$Joe's stuff$$);
ERROR:  syntax error in tsquery: "Joe's stuff"

freshports.dev=# SELECT to_tsquery('english', $$Joes stuff$$);
ERROR:  syntax error in tsquery: "Joes stuff"

freshports.dev=# SELECT to_tsquery('english', $$Joe stuff$$);
ERROR:  syntax error in tsquery: "Joe stuff"

freshports.dev=# SELECT to_tsquery('english', $$Joestuff$$);
 to_tsquery 
------------
 'joestuff'
(1 row)

It turns out the solution is to use websearch_to_tsquery() instead. The documentation mentions:

Moreover, this function should never raise syntax errors, which makes it possible to use raw user-supplied input for search.

That sounds ideal, and it was. It completely fixes the issue.

freshports.dev=# SELECT websearch_to_tsquery('english', $$Joe's stuff$$);
 websearch_to_tsquery 
----------------------
 'joe' & 'stuff'
(1 row)

freshports.dev=# SELECT websearch_to_tsquery('english', $$Joes stuff$$);
 websearch_to_tsquery 
----------------------
 'joe' & 'stuff'
(1 row)

freshports.dev=# SELECT websearch_to_tsquery('english', $$Joe stuff$$);
 websearch_to_tsquery 
----------------------
 'joe' & 'stuff'
(1 row)
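
On the website side, the fix is then a one-word swap in the parameterized query (again, a sketch rather than the exact production query):

-- websearch_to_tsquery() never raises a syntax error on raw user input
SELECT port_id
  FROM generate_plist
 WHERE textsearchable_index_col @@ websearch_to_tsquery('english', $1);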

Great. Hope this helps.

Mar 092021
 

In yesterday’s blog post, Replacing a column search with a full text search solution, I wanted to mention how adding a LIMIT 10 to a fast-running query made it run slower. I didn’t remember that issue until this morning. Fortunately, the fix was low cost.

The query

This is the query I ran:

EXPLAIN ANALYSE
WITH short_list AS (
    SELECT
        port_id
    FROM
        generate_plist
    WHERE
        textsearchable_index_col @@ to_tsquery('share/examples/acme.sh/dnsapi/dns_1984hosting.sh')
)
  select CL.commit_date - SystemTimeAdjust() AS last_commit_date, 
         P.id, 
         E.name as port,
         C.name as category, 
...
         P.uses  
       FROM short_list, ports P
       LEFT OUTER JOIN ports_vulnerable    PV  ON PV.port_id       = P.id
       LEFT OUTER JOIN commit_log          CL  ON P.last_commit_id = CL.id,
       element_pathname EP,
       categories C, element E 
 WHERE P.id = short_list.port_id
   AND P.element_id = EP.element_id and EP.pathname like '/ports/head/%'
   AND P.category_id  = C.id
   AND P.element_id   = E.id  AND E.status = 'A';

The plan (as found at https://explain.depesz.com/s/Bkkg) was fast enough for me.

Adding LIMIT 10

This was used on the FreshPorts search page (not in production, but on the dev website). It needs a limit, so I added LIMIT 10.

The plan changed to a sequential scan of a large table. Boom, 28 seconds. That is unsatisfactory. The plan is at https://explain.depesz.com/s/oJZB.

The Fix

The fix, as pointed out by Thomas Hurst in his reply to my tweet, centered around a CTE (Common Table Expression). That initially confused me, because the query was already using a CTE. That’s the WITH short_list AS part of the query.

Then I read the AS MATERIALIZED part of his reply and started reading up on that. I changed the query to:

EXPLAIN ANALYSE
WITH short_list AS MATERIALIZED (
    SELECT
        port_id
    FROM
        generate_plist
    WHERE
        textsearchable_index_col @@ to_tsquery('share/examples/acme.sh/dnsapi/dns_1984hosting.sh')
)
  select CL.commit_date - SystemTimeAdjust() AS last_commit_date, 
         P.id, 
         E.name as port,
         C.name as category, 
...
         P.uses  
       FROM short_list, ports P
       LEFT OUTER JOIN ports_vulnerable    PV  ON PV.port_id       = P.id
       LEFT OUTER JOIN commit_log          CL  ON P.last_commit_id = CL.id,
       element_pathname EP,
       categories C, element E 
 WHERE P.id = short_list.port_id
   AND P.element_id = EP.element_id and EP.pathname like '/ports/head/%'
   AND P.category_id  = C.id
   AND P.element_id   = E.id  AND E.status = 'A'
LIMIT 10;

You can see the change on line 2: AS MATERIALIZED.

The plan (as found at https://explain.depesz.com/s/UBps) takes 1.4s and is back to using an index scan.
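
For background: since PostgreSQL 12, a CTE that is referenced only once may be inlined into the outer query, which is presumably what let the planner choose the sequential scan once LIMIT 10 was added. MATERIALIZED forces the CTE to be computed on its own, acting as an optimization fence. A trimmed-down illustration (not the production query):

-- MATERIALIZED: the CTE is computed once and acts as an optimization fence
-- NOT MATERIALIZED (the default since PostgreSQL 12): the planner may inline it
WITH short_list AS MATERIALIZED (
    SELECT port_id
      FROM generate_plist
     WHERE textsearchable_index_col @@ to_tsquery('share/examples/acme.sh/dnsapi/dns_1984hosting.sh')
)
SELECT count(*) FROM short_list;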

Further reading:

Mar 082021
 

One of the many useful features on FreshPorts is: what port[s] install[s] this file? That’s the pkg-plist search option. pkg-plist is a file which “lists all the files installed by the port”. However, not all ports have a pkg-plist file; for some ports the list is compiled automatically. That is why the generate_plist table was created to replace the ports.pkg_plist column. The creation of that table broke the search function because it was then working on outdated data.

Last week, I blogged about how I created a new stored procedure for pulling back the pkg-plist information.

The new table

By new, I mean new to this search solution. The table looks like this:

freshports.dev=# \d generate_plist
                                Table "public.generate_plist"
     Column     |  Type   | Collation | Nullable |                  Default                   
----------------+---------+-----------+----------+--------------------------------------------
 id             | bigint  |           | not null | nextval('generate_plist_id_seq'::regclass)
 port_id        | integer |           | not null | 
 installed_file | text    |           | not null | 
Indexes:
    "generate_plist_installed_file_gin_idx" gin (to_tsvector('english'::regconfig, installed_file))
    "generate_plist_installed_file_idx" btree (installed_file)
    "generate_plist_pk" btree (id)
    "generate_plist_port_id_idx" btree (port_id)

freshports.dev=# 

For a given port, there will be N entries in this table, one for each item in the pkg-plist. You will notice some indexes on the installed_file column.

A simple search

With the old solution, we could search like this, but it took time:

AND P.pkg_plist ILIKE '%share/examples/acme.sh/dnsapi/dns_1984hosting.sh%'

Over 2.7 seconds in fact.

Let’s try looking in the generate_plist table instead. I was reading the Tables and Indexes section of the PostgreSQL Full Text Search documentation and decided to start with this:

ALTER TABLE public.generate_plist
    ADD COLUMN textsearchable_index_col tsvector GENERATED ALWAYS AS (to_tsvector('english'::regconfig, installed_file)) STORED;

CREATE INDEX generate_plist_textsearch_idx
    ON public.generate_plist USING gin
    (textsearchable_index_col)
    TABLESPACE pg_default;

With that, the following search was pretty damn nice.

freshports.dev=# explain analyse    SELECT
        port_id
    FROM
        generate_plist
    WHERE
        textsearchable_index_col @@ to_tsquery('share/examples/acme.sh/dnsapi/dns_1984hosting.sh')
;
                                                                QUERY PLAN                                                                
------------------------------------------------------------------------------------------------------------------------------------------
 Bitmap Heap Scan on generate_plist  (cost=367.55..2143.46 rows=426 width=4) (actual time=2.116..2.123 rows=2 loops=1)
   Recheck Cond: (textsearchable_index_col @@ to_tsquery('share/examples/acme.sh/dnsapi/dns_1984hosting.sh'::text))
   Heap Blocks: exact=2
   ->  Bitmap Index Scan on generate_plist_textsearch_idx  (cost=0.00..367.44 rows=426 width=0) (actual time=2.095..2.095 rows=2 loops=1)
         Index Cond: (textsearchable_index_col @@ to_tsquery('share/examples/acme.sh/dnsapi/dns_1984hosting.sh'::text))
 Planning Time: 0.434 ms
 Execution Time: 2.208 ms
(7 rows)

But this was … umm, what?

freshports.dev=# SELECT
        port_id
    FROM
        generate_plist
    WHERE
        textsearchable_index_col @@ to_tsquery('dns_1984hosting.sh')
;
 port_id 
---------
(0 rows)

freshports.dev=# 

Nothing found? What?

Full text search is not like that

Full text search is not like that. My search on the full path name would have worked just fine with the existing index:

freshports.dev=# explain analyse SELECT
        port_id
    FROM
        generate_plist
    WHERE
        installed_file = 'share/examples/acme.sh/dnsapi/dns_1984hosting.sh';
                                                                    QUERY PLAN                                                                     
---------------------------------------------------------------------------------------------------------------------------------------------------
 Index Scan using generate_plist_installed_file_idx on generate_plist  (cost=0.56..28.66 rows=6 width=4) (actual time=0.046..0.052 rows=2 loops=1)
   Index Cond: (installed_file = 'share/examples/acme.sh/dnsapi/dns_1984hosting.sh'::text)
 Planning Time: 0.222 ms
 Execution Time: 0.088 ms
(4 rows)

freshports.dev=# 

In fact, it’s faster.

But that’s not what is needed. Let’s look at the use case.

Searching for file names

We want to search for the full path or the file name. Or just dns_1984hosting.sh or dns_1984hosting.

With help from RhodiumToad on IRC, I learned that tsearch is all about matching words (technically lexemes), not arbitrary substrings. We need to think in terms of lexemes. For example:

freshports.dev=# select ts_debug('english', 'foo/bar');
                           ts_debug                           
--------------------------------------------------------------
 (file,"File or path name",foo/bar,{simple},simple,{foo/bar})
(1 row)

freshports.dev=# 
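
Converting the slash to a space changes that: the two parts become separate lexemes. A quick check, using the same 'english' configuration:

select to_tsvector('english', translate('foo/bar', '/', ' '));
-- yields: 'bar':2 'foo':1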

So let’s split the file name into distinct words. Let’s convert the / to space and add a new column for that:

ALTER TABLE public.generate_plist
    ADD COLUMN textsearchable_index_col2 tsvector GENERATED ALWAYS AS (to_tsvector('english'::regconfig, translate(installed_file, '/'::text, ' '::text))) STORED;

CREATE INDEX generate_plist_textsearch_idx2
    ON public.generate_plist USING gin
    (textsearchable_index_col2)
    TABLESPACE pg_default;

Let’s try searching now:

freshports.dev=# EXPLAIN ANALYSE
WITH short_list AS (
    SELECT
        port_id, installed_file
    FROM
        generate_plist
    WHERE
     textsearchable_index_col2 @@ to_tsquery('dns_1984hosting.sh')
)
  select P.id, element_pathname(P.element_id), SL.installed_file
  FROM ports P, short_list SL
 WHERE P.id = SL.port_id;
                                                                    QUERY PLAN                                                                    
--------------------------------------------------------------------------------------------------------------------------------------------------
 Nested Loop  (cost=171.64..25352.59 rows=2981 width=88) (actual time=1.337..1.506 rows=2 loops=1)
   ->  Bitmap Heap Scan on generate_plist  (cost=171.35..12102.68 rows=2981 width=56) (actual time=1.031..1.040 rows=2 loops=1)
         Recheck Cond: (textsearchable_index_col2 @@ to_tsquery('dns_1984hosting.sh'::text))
         Heap Blocks: exact=2
         ->  Bitmap Index Scan on generate_plist_textsearch_idx2  (cost=0.00..170.61 rows=2981 width=0) (actual time=1.015..1.015 rows=2 loops=1)
               Index Cond: (textsearchable_index_col2 @@ to_tsquery('dns_1984hosting.sh'::text))
   ->  Index Scan using ports_pkey on ports p  (cost=0.29..4.19 rows=1 width=8) (actual time=0.014..0.014 rows=1 loops=2)
         Index Cond: (id = generate_plist.port_id)
 Planning Time: 1.064 ms
 Execution Time: 1.596 ms
(10 rows)

freshports.dev=# 

Umm, that’s fast.

But if we try the file without the .sh we get nothing:

freshports.dev=# WITH short_list AS (
    SELECT
        port_id, installed_file
    FROM
        generate_plist
    WHERE
     textsearchable_index_col2 @@ to_tsquery('dns_1984hosting')
)
  select P.id, element_pathname(P.element_id), SL.installed_file
  FROM ports P, short_list SL
 WHERE P.id = SL.port_id;
 id | element_pathname | installed_file 
----+------------------+----------------
(0 rows)

freshports.dev=# 

Another column for that

Humor me here while I add another column:

ALTER TABLE public.generate_plist
    ADD COLUMN textsearchable_index_col3 tsvector GENERATED ALWAYS AS (to_tsvector('english'::regconfig, translate(installed_file, '/.'::text, '  '::text))) STORED;

CREATE INDEX generate_plist_textsearch_idx3
    ON public.generate_plist USING gin
    (textsearchable_index_col3)
    TABLESPACE pg_default;

This translates both / and . to a space. Now we have:

freshports.dev=# WITH short_list AS (
    SELECT
        port_id, installed_file
    FROM
        generate_plist
    WHERE
     textsearchable_index_col3 @@ to_tsquery('dns_1984hosting')
)
  select P.id, element_pathname(P.element_id), SL.installed_file
  FROM ports P, short_list SL
 WHERE P.id = SL.port_id;
  id   |            element_pathname             |                  installed_file                  
-------+-----------------------------------------+--------------------------------------------------
 59654 | /ports/branches/2020Q3/security/acme.sh | share/examples/acme.sh/dnsapi/dns_1984hosting.sh
 43508 | /ports/head/security/acme.sh            | share/examples/acme.sh/dnsapi/dns_1984hosting.sh
(2 rows)

freshports.dev=# 

Which is still lightning fast.

Doing them all at once

Creating a query which uses all three columns is still performant:

freshports.dev=# explain analyse WITH short_list AS (
    SELECT
        DISTINCT port_id, installed_file
    FROM
        generate_plist
    WHERE
        textsearchable_index_col  @@ to_tsquery('bacula')
     OR textsearchable_index_col2 @@ to_tsquery('bacula')
     OR textsearchable_index_col3 @@ to_tsquery('bacula')
)
  select P.id, element_pathname(P.element_id), SL.installed_file
  FROM ports P, short_list SL
 WHERE P.id = SL.port_id;
                                                                                                       QUERY PLAN                                                                                                       
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
 Hash Join  (cost=50810.64..53335.38 rows=7790 width=88) (actual time=273.782..520.103 rows=1872 loops=1)
   Hash Cond: (sl.port_id = p.id)
   ->  Subquery Scan on sl  (cost=35602.05..35757.85 rows=7790 width=56) (actual time=14.561..16.078 rows=1872 loops=1)
         ->  HashAggregate  (cost=35602.05..35679.95 rows=7790 width=56) (actual time=14.558..15.495 rows=1872 loops=1)
               Group Key: generate_plist.port_id, generate_plist.installed_file
               ->  Bitmap Heap Scan on generate_plist  (cost=1705.16..35563.02 rows=7806 width=56) (actual time=11.540..12.771 rows=1872 loops=1)
                     Recheck Cond: ((textsearchable_index_col @@ to_tsquery('bacula'::text)) OR (textsearchable_index_col2 @@ to_tsquery('bacula'::text)) OR (textsearchable_index_col3 @@ to_tsquery('bacula'::text)))
                     Heap Blocks: exact=116
                     ->  BitmapOr  (cost=1705.16..1705.16 rows=7808 width=0) (actual time=11.489..11.490 rows=0 loops=1)
                           ->  Bitmap Index Scan on generate_plist_textsearch_idx  (cost=0.00..415.44 rows=426 width=0) (actual time=2.994..2.995 rows=36 loops=1)
                                 Index Cond: (textsearchable_index_col @@ to_tsquery('bacula'::text))
                           ->  Bitmap Index Scan on generate_plist_textsearch_idx2  (cost=0.00..170.61 rows=2981 width=0) (actual time=1.373..1.374 rows=1857 loops=1)
                                 Index Cond: (textsearchable_index_col2 @@ to_tsquery('bacula'::text))
                           ->  Bitmap Index Scan on generate_plist_textsearch_idx3  (cost=0.00..1113.26 rows=4401 width=0) (actual time=7.117..7.117 rows=1872 loops=1)
                                 Index Cond: (textsearchable_index_col3 @@ to_tsquery('bacula'::text))
   ->  Hash  (cost=14174.04..14174.04 rows=63004 width=8) (actual time=258.282..258.283 rows=63065 loops=1)
         Buckets: 32768  Batches: 4  Memory Usage: 877kB
         ->  Seq Scan on ports p  (cost=0.00..14174.04 rows=63004 width=8) (actual time=0.089..219.485 rows=63065 loops=1)
 Planning Time: 1.227 ms
 Execution Time: 521.244 ms
(20 rows)

freshports.dev=# 

At what cost?

What does the table look like now:

freshports.dev=# \d generate_plist
                                                                          Table "public.generate_plist"
          Column           |   Type   | Collation | Nullable |                                                      Default                                                      
---------------------------+----------+-----------+----------+-------------------------------------------------------------------------------------------------------------------
 id                        | bigint   |           | not null | nextval('generate_plist_id_seq'::regclass)
 port_id                   | integer  |           | not null | 
 installed_file            | text     |           | not null | 
 textsearchable_index_col  | tsvector |           |          | generated always as (to_tsvector('english'::regconfig, installed_file)) stored
 textsearchable_index_col2 | tsvector |           |          | generated always as (to_tsvector('english'::regconfig, translate(installed_file, '/'::text, ' '::text))) stored
 textsearchable_index_col3 | tsvector |           |          | generated always as (to_tsvector('english'::regconfig, translate(installed_file, '/.'::text, '  '::text))) stored
Indexes:
    "generate_plist_installed_file_gin_idx" gin (to_tsvector('english'::regconfig, installed_file))
    "generate_plist_installed_file_idx" btree (installed_file)
    "generate_plist_pk" btree (id)
    "generate_plist_port_id_idx" btree (port_id)
    "generate_plist_textsearch_idx" gin (textsearchable_index_col)
    "generate_plist_textsearch_idx2" gin (textsearchable_index_col2)
    "generate_plist_textsearch_idx3" gin (textsearchable_index_col3)

How much space does this take:



freshports.dev=# select count(*) from generate_plist;
  count  
---------
 8478513
(1 row)


freshports.dev=# SELECT pg_table_size('generate_plist');
 pg_table_size 
---------------
    3578167296
(1 row)

freshports.dev=# select pg_total_relation_size('generate_plist');
 pg_total_relation_size 
------------------------
             6730162176
(1 row)

That’s 8.5 million rows taking up 3.3 GB but with indexes it’s 6.3 GB.
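
If you are curious where the index space goes, the standard statistics views break it down per index (nothing FreshPorts-specific here):

SELECT indexrelname,
       pg_size_pretty(pg_relation_size(indexrelid)) AS index_size
  FROM pg_stat_user_indexes
 WHERE relname = 'generate_plist'
 ORDER BY pg_relation_size(indexrelid) DESC;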

Disk is cheap. Time is expensive. I’d say the cost is worth the performance.

Mar 022021
 

In this post I will show you how I discovered that the backend was putting data into a new table (generate_plist) but the website was still using the ports.pkg_plist column. The result was stale data being shown on the website (for older ports) or no data at all (for newer ports).

How long? Since Oct 2017, because that’s when configure_plist showed up on the website. Nobody noticed until recently. I can only guess that I made the backend changes with intent to change the frontend later. I should have created myself a ticket.

Why was the change made? Primarily because not all ports, especially Python-based ports, have a pkg-plist file. These ports generate the list automatically at package-build time. Secondly, it also simplifies searching the pkg-plist data.

Ironically, this bug also affects search, which is still using the old column.

Background

One set of data which FreshPorts shows is the list of files installed by a given package. This information can be provided via a file, called pkg-plist.

Here is one example, for the comms/libconcord port:

freshports.dev=# select pkg_plist from ports where id = 34724;
             pkg_plist              
------------------------------------
 include/libconcord.h              +
 lib/libconcord.a                  +
 lib/libconcord.so                 +
 lib/libconcord.so.4               +
 lib/libconcord.so.4.0.0           +
 share/mime/packages/libconcord.xml
(1 row)

freshports.dev=# 

NOTE: the above is one row of output. It is a collection of strings, separated by newlines.

If you visit the website, you’ll see that the list is wrong, because it’s outdated.

The generate-plist command is part of the ports infrastructure. FreshPorts gathers this data by running this command and reading the resulting file:

$ make generate-plist
$ cat work/.PLIST.mktmp

A recent blog post goes into more detail. All of that goes into the generate_plist table (in the transcript below, my first attempt uses the name configure_plist, which does not exist):

freshports.dev=# \d configure_plist
Did not find any relation named "configure_plist".
freshports.dev=# \d generate_plist
                                Table "public.generate_plist"
     Column     |  Type   | Collation | Nullable |                  Default                   
----------------+---------+-----------+----------+--------------------------------------------
 id             | bigint  |           | not null | nextval('generate_plist_id_seq'::regclass)
 port_id        | integer |           | not null | 
 installed_file | text    |           | not null | 
Indexes:
    "generate_plist_installed_file_gin_idx" gin (to_tsvector('english'::regconfig, installed_file))
    "generate_plist_installed_file_idx" btree (installed_file)
    "generate_plist_pk" btree (id)
    "generate_plist_port_id_idx" btree (port_id)

freshports.dev=# 

The original data

This is what the data looked like, as demonstrated by this query:

freshports.dev=# select pkg_plist, array_to_json(regexp_match(pkg_plist, 'lib/[[:alpha:]]*?\.so')) from ports where id = 34724;
             pkg_plist              |     array_to_json     
------------------------------------+-----------------------
 include/libconcord.h              +| ["lib/libconcord.so"]
 lib/libconcord.a                  +| 
 lib/libconcord.so                 +| 
 lib/libconcord.so.4               +| 
 lib/libconcord.so.4.0.0           +| 
 share/mime/packages/libconcord.xml | 
(1 row)

NOTE: This example uses :alpha: but I later changed it to :alnum: in my code.

NOTE: The above is one row.

The goal is to grab all the .so libraries.

The new layout

The new table, with refreshed data, looks like:

freshports.dev=# select * from generate_plist where port_id = 34724;
    id    | port_id |                       installed_file                       
----------+---------+------------------------------------------------------------
 27791797 |   34724 | @shared-mime-info share/mime
 27791798 |   34724 | /usr/local/share/licenses/libconcord-1.2_2/catalog.mk
 27791799 |   34724 | /usr/local/share/licenses/libconcord-1.2_2/LICENSE
 27791800 |   34724 | /usr/local/share/licenses/libconcord-1.2_2/GPLv2
 27791801 |   34724 | include/libconcord.h
 27791802 |   34724 | lib/libconcord.a
 27791803 |   34724 | lib/libconcord.so
 27791804 |   34724 | lib/libconcord.so.4
 27791805 |   34724 | lib/libconcord.so.4.0.0
 27791806 |   34724 | share/mime/packages/libconcord.xml
 27791807 |   34724 | @postexec /usr/sbin/service ldconfig restart > /dev/null
 27791808 |   34724 | @postunexec /usr/sbin/service ldconfig restart > /dev/null
(12 rows)

freshports.dev=# 

First go

NOTE: I have removed the array_to_json() call for now:

freshports.dev=# select installed_file, regexp_match(installed_file, 'lib/[[:alpha:]]*?\.so') 
from generate_plist where port_id = 34724;
                       installed_file                       |    regexp_match     
------------------------------------------------------------+---------------------
 @shared-mime-info share/mime                               | 
 /usr/local/share/licenses/libconcord-1.2_2/catalog.mk      | 
 /usr/local/share/licenses/libconcord-1.2_2/LICENSE         | 
 /usr/local/share/licenses/libconcord-1.2_2/GPLv2           | 
 include/libconcord.h                                       | 
 lib/libconcord.a                                           | 
 lib/libconcord.so                                          | {lib/libconcord.so}
 lib/libconcord.so.4                                        | {lib/libconcord.so}
 lib/libconcord.so.4.0.0                                    | {lib/libconcord.so}
 share/mime/packages/libconcord.xml                         | 
 @postexec /usr/sbin/service ldconfig restart > /dev/null   | 
 @postunexec /usr/sbin/service ldconfig restart > /dev/null | 
(12 rows)

That looks good so far. What I did not realize until later: each value of the regexp_match output is an array.

Let’s try this on a port with more libraries:

freshports.dev=# select distinct regexp_match(installed_file, 'lib/[[:alnum:]]*?\.so') as lib 
   from generate_plist
  WHERE generate_plist.port_id = 9589 and regexp_match(installed_file, 'lib/[[:alnum:]]*?\.so')  is not null;
        lib         
--------------------
 {lib/libhamlib.so}
 {lib/hamlibtcl.so}
 {lib/Hamlib.so}
(3 rows)

That still looks good.

Getting rid of duplicate values and ignoring null

I thought this was a simple way to get rid of the uninteresting values. I like using WITH. It allows you to take a very complex set of data and adjust it before performing additional operations upon it. What I’m doing here is not very complicated and I am sure it can be done without WITH.

freshports.dev=# WITH tmp AS (
    select regexp_match(installed_file, 'lib/[[:alnum:]]*?\.so') as lib 
      from generate_plist
     where generate_plist.port_id = 9589
)
select distinct lib from tmp where lib is not null;
        lib         
--------------------
 {lib/libhamlib.so}
 {lib/hamlibtcl.so}
 {lib/Hamlib.so}
(3 rows)
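
As mentioned above, it can be done without WITH; here is the same query written as a subquery in FROM, which returns the same three rows:

select distinct lib
  from (select regexp_match(installed_file, 'lib/[[:alnum:]]*?\.so') as lib
          from generate_plist
         where generate_plist.port_id = 9589) as tmp
 where lib is not null;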

JSON time

In this section, there are three main changes.

Convert the results from an array to the first element of that array. We are dealing with a single file, so we should get just one match.

freshports.dev=# WITH tmp AS (
    select regexp_match(installed_file, 'lib/[[:alnum:]]*?\.so') as lib 
      from generate_plist
     where generate_plist.port_id = 9589)
select distinct lib[1] from tmp where lib is not null;
       lib        
------------------
 lib/hamlibtcl.so
 lib/libhamlib.so
 lib/Hamlib.so
(3 rows)

Next, combine all the rows into one row, using array.

freshports.dev=# WITH tmp AS (
    select regexp_match(installed_file, 'lib/[[:alnum:]]*?\.so') as lib 
      from generate_plist
     where generate_plist.port_id = 9589)
select array(select distinct lib[1] from tmp where lib is not null);
                       array                       
---------------------------------------------------
 {lib/hamlibtcl.so,lib/libhamlib.so,lib/Hamlib.so}
(1 row)

The last step: change that to JSON output.

freshports.dev=# WITH tmp AS (
    select regexp_match(installed_file, 'lib/[[:alnum:]]*?\.so') as lib 
      from generate_plist
     where generate_plist.port_id = 9589)
select array_to_json(array(select distinct lib[1] from tmp where lib is not null));
                      array_to_json                      
---------------------------------------------------------
 ["lib/hamlibtcl.so","lib/libhamlib.so","lib/Hamlib.so"]
(1 row)

But…

It seems wasteful, or backward, to be converting to an array and then to JSON, but I think that’s the way it needs to be done for this data. I’m happy to hear additional ideas if you have them.

EDIT: Magnus Hagander gave me the solution there. Use json_agg instead.

freshports.dev=# WITH tmp AS (
    select regexp_match(installed_file, 'lib/[[:alnum:]]*?\.so') as lib 
      from generate_plist
     where generate_plist.port_id = 14152)
select json_agg(distinct lib[1]) from tmp where lib is not null;
                                              json_agg                                               
-----------------------------------------------------------------------------------------------------
 ["lib/libfreeipmi.so", "lib/libipmiconsole.so", "lib/libipmidetect.so", "lib/libipmimonitoring.so"]
(1 row)

A more complex example

Let’s try sysutils/freeipmi, which has more libraries:

freshports.dev=# WITH tmp AS (
    select regexp_match(installed_file, 'lib/[[:alnum:]]*?\.so') as lib 
      from generate_plist
     where generate_plist.port_id = 14152)
select array_to_json(array(select distinct lib[1] from tmp where lib is not null));
                                          array_to_json                                           
--------------------------------------------------------------------------------------------------
 ["lib/libfreeipmi.so","lib/libipmidetect.so","lib/libipmiconsole.so","lib/libipmimonitoring.so"]
(1 row)

freshports.dev=# 

That seems to work.

Put it into a function

This is the function I created:

-- to convert the generate_plist table, multiple rows per port, into a
-- single row of JSON. Mostly used for Dependency line:
--
-- re https://github.com/FreshPorts/freshports/issues/216
--
CREATE OR REPLACE FUNCTION pkg_plist(bigint) returns json as $$
   WITH tmp AS (
       select regexp_match(installed_file, 'lib/[[:alnum:]]*?\.so') as lib 
         from generate_plist
        where generate_plist.port_id = $1
   )
   select array_to_json(array(select distinct lib[1] from tmp where lib is not null));
$$ LANGUAGE SQL STABLE;

The changed code

The original code was:

freshports.dev=# explain analyse
select pkg_plist, array_to_json(regexp_match(pkg_plist, 'lib/[[:alpha:]]*?\.so')) from ports where id = 34724;
                                                     QUERY PLAN                                                     
--------------------------------------------------------------------------------------------------------------------
 Index Scan using ports_pkey on ports  (cost=0.29..8.31 rows=1 width=190) (actual time=0.092..0.096 rows=1 loops=1)
   Index Cond: (id = 34724)
 Planning Time: 0.305 ms
 Execution Time: 0.138 ms
(4 rows)
freshports.dev=# 

The new code is:

freshports.dev=# explain analyse
freshports.dev-# select pkg_plist, pkg_plist(id) from ports where id = 34724;
                                                     QUERY PLAN                                                     
--------------------------------------------------------------------------------------------------------------------
 Index Scan using ports_pkey on ports  (cost=0.29..8.56 rows=1 width=190) (actual time=0.323..0.324 rows=1 loops=1)
   Index Cond: (id = 34724)
 Planning Time: 0.229 ms
 Execution Time: 0.340 ms
(4 rows)

That is an acceptable difference.

Thanks.

Feb 242021
 

This is a cheat sheet for FreshPorts behind-the-scenes administration. If you want one for users, please start with suggested content.

Ingress Actions

  1. git-single-commit.sh – Take a commit from the repo and insert it into the database. It will fail if the commit is already there. REPO is one of src, ports, doc. HASH is the git commit hash.
    /usr/local/libexec/freshports/git-single-commit.sh REPO HASH
    
  2. process vuxml – This will invoke vuxml processing of whatever is checked out into the ~freshports/ports-jail/var/db/repos/PORTS-head/ working copy of the ports tree. It might also work on the ~freshports/ports-jail/var/db/repos/ports (git) repo, but I don’t know how it tells which one to work on.
    echo touch /var/db/freshports/signals/vuxml /var/db/freshports/signals/job_waiting | sudo su -fm freshports
    

    Ahh, it knows based on VULNFILE=”${FRESHPORTS_JAIL_BASE_DIR}${PORTSDIR}/security/vuxml/vuln.xml”.

  3. vuln_latest.pl – refreshes the Latest Vulnerabilities box of the website.
    echo /usr/local/bin/perl /usr/local/libexec/freshports/vuln_latest.pl | sudo su -fm freshports
    
  4. Archive messages from 11 days ago
    echo /usr/local/libexec/freshports/archive-messages.sh 11 | sudo su -fm freshports
    

Ingress Services

  1. ingress – processes git commits into XML
  2. ingress_svn – processes subversion commits into XML
  3. freshports – processes the XML and loads them into the database

Caching

  • NOTIFY port_updated; – issue this command within the database. The fp_listen daemon will wake up, read the table cache_clearing_ports, and clear the cache for the listed port.
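
    Roughly, from psql, that looks like the following (the cache_clearing_ports column name is an assumption; check the schema first):

    -- queue a port for cache clearing, then wake the fp_listen daemon
    INSERT INTO cache_clearing_ports (port_id) VALUES (12345);  -- port_id column assumed
    NOTIFY port_updated;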