7-Zip is probably spending more time on I/O than on compressing data. My suggestion is to use -mx9 (Ultra mode) and -md30 (the bare -md value is an exponent, so 2^30 bytes = 1 GB dictionary size).
The default dictionary size is 16 MB; in Ultra mode it is 64 MB. 7-Zip uses about 10.5 times the dictionary size for memory buffers, so -md30 raises total memory use to around 11 GB. Of course, a large dictionary also means the computer extracting the archive has to allocate that much memory for the dictionary. You could also try -md28 (a 256 MB dictionary).
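To make the arithmetic concrete, here is a sketch of the suggested invocation together with the memory estimate worked out in shell. The archive name and input path are placeholders, and the 10.5x multiplier is the rule of thumb quoted above:

```shell
# 2^30 bytes = the dictionary size selected by -md30
dict_bytes=$((1 << 30))
# Rough total memory: ~10.5x the dictionary (computed as x21/2 to stay in integers)
total_mb=$(( dict_bytes / 1024 / 1024 * 21 / 2 ))

# Hypothetical command line; "backup.7z" and "data/" are placeholders
echo "7z a -mx9 -md30 backup.7z data/"
echo "Estimated compression memory: ${total_mb} MB"
```

The echoed estimate comes out at 10752 MB, i.e. roughly the 11 GB mentioned above; the machine that extracts the archive still needs the 1 GB dictionary, but not the full compression working set.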
Increasing the number of threads or splitting the archive into volumes with the -v switch reduces the compression ratio (according to the command-line documentation). Setting -v4095m should still allow recovery of the remaining files if a volume is lost, though I am not certain of that.
The -slp (Set Large Pages mode) switch might speed up compression. Review the description and cautions at http://sevenzip.sourceforge.jp/chm/cmdline/switches/large_pages.htm
In general, with that much memory you do not need a paging file (except for saving crash dumps). If you have a large paging file, Windows will try to move most of the program's memory into it and page parts back in as needed, which will also slow the program considerably.