I tried the import implemented by nic's gritttt-rss, but it failed.
Setup: tt-rss v1.7.5, PHP 5.4.13, PostgreSQL 9.2.3; dedicated hosting, multi-user tt-rss.
Running import.py with Python 2.7.3 (it doesn't seem to work with Python 3) produces an 11 MB gritttt-import.sql (yes, long-time Greader user here...) containing 14792 INSERT statements.
Then, running "psql -U tt-rss -f gritttt-import.sql tt-rss &> log", the transaction is aborted with:
psql:gritttt-import.sql:63434: ERROR: index row size 4072 exceeds maximum 2712 for index "ttrss_entries_guid_key"
HINT: Values larger than 1/3 of a buffer page cannot be indexed.
Consider a function index of an MD5 hash of the value, or use full text indexing.
So my guess is that the ttrss_entries_guid_key index uses a btree and my starred/shared feeds include an entry with a very long guid.
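The hint from Postgres suggests indexing an MD5 digest of the value instead of the raw value. A minimal Python sketch of why that works (the 2712-byte limit is taken from the error above; the guid below is illustrative, not the actual offending entry):

```python
import hashlib

BTREE_INDEX_LIMIT = 2712  # max index row size reported by Postgres above


def fits_in_btree(guid):
    """Return True if the raw guid would fit in the btree index row."""
    return len(guid.encode("utf-8")) <= BTREE_INDEX_LIMIT


def index_key(guid):
    """32-char hex MD5 digest -- always small enough to index."""
    return hashlib.md5(guid.encode("utf-8")).hexdigest()


# A synthetic guid the same size as the row that broke the import:
long_guid = "x" * 4072
print(fits_in_btree(long_guid))   # False: 4072 bytes > 2712
print(len(index_key(long_guid)))  # 32: the digest always fits
```

On the Postgres side this would correspond to the expression index the hint mentions, but I haven't tried changing the tt-rss schema.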
I managed to isolate the offending entry in the generated SQL and in shared.json. Here are the relevant parts (as far as I can tell):
* insert statement: http://dpaste.org/ZwaPI/raw/
* original offending entry: http://bpaste.net/show/87488/
(notice that it's invalid JSON)
* corrected offending entry: http://bpaste.net/show/87490/
(I had to use dpaste for the SQL statement because bpaste decided it contained spam... whatever.)
So the real problem here might be that my Google Reader Takeout isn't valid JSON at all. Nice. I'll try to edit and validate this huge JSON file and then start the import again...
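Before re-running import.py, a quick way to find the bad spot in a large Takeout file is to let Python's json module report the line/column of the first syntax error. A small sketch ("shared.json" is the file from the export above):

```python
import json


def first_json_error(path):
    """Return None if the file parses, else the parser's error message
    (which includes the line and column of the first bad character)."""
    with open(path) as f:
        try:
            json.load(f)
        except ValueError as e:  # json.JSONDecodeError on Python 3
            return str(e)
    return None


# e.g. first_json_error("shared.json")
# -> "Expecting ',' delimiter: line 1234 column 56 (char 987654)" or None
```

This only finds the first error, so it may take a few edit-and-revalidate rounds on a file this size.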