Just to let you know: people are helping me make the import of starred and shared articles from Google Reader work not only for MySQL databases, but also for Postgres.
See progress here: https://github.com/nhoening/gritttt-rss/issues/37
The import script is here: https://github.com/nhoening/gritttt-rss ... der-import
Import shared/starred items from GReader: Postgres support
- fox
- ^ me reading your posts ^
- Posts: 6318
- Joined: 27 Aug 2005, 22:53
- Location: Saint-Petersburg, Russia
- Contact:
Re: Import shared/starred items from GReader: Postgres support
Can someone share an example of this so I could look into making an import plugin already? I totally promise not to laugh at your silly starred items.
Re: Import shared/starred items from GReader: Postgres support
I could give you a JSON snippet from my old withered Google account. But why would you need an example when you can see in my import script what the relevant parts are that need to be pulled out from it?
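For reference, pulling the relevant parts out of a takeout item only takes a few lines of Python. This is a sketch, not the actual import script; the field names (`items`, `title`, `published`, `alternate`, `summary`/`content`) follow the Reader takeout format as I've seen it, so check them against your own dump:

```python
import json

def extract_items(path):
    """Yield the fields an importer needs from a Google Reader takeout file.

    Field names follow the Reader takeout JSON layout; adjust if your
    dump differs. This is an illustrative sketch, not gritttt-rss itself.
    """
    with open(path) as f:
        data = json.load(f)
    for item in data.get("items", []):
        # The article body sits under "content" or "summary", each a dict
        # with its own "content" key.
        body = item.get("content", item.get("summary", {})).get("content", "")
        # The canonical link is the first entry of the "alternate" list.
        link = item["alternate"][0]["href"] if item.get("alternate") else ""
        yield {
            "title": item.get("title", "(no title)"),
            "link": link,
            "published": item.get("published", 0),  # unix timestamp
            "content": body,
        }
```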
- fox
Re: Import shared/starred items from GReader: Postgres support
I dunno, it feels easier somehow. Anyway nvm, I just starred some articles and made my own.
Re: Import shared/starred items from GReader: Postgres support
Cool. No shared items, though?
- fox
Re: Import shared/starred items from GReader: Postgres support
Do those have a different JSON format? I don't have any.
- fox
Re: Import shared/starred items from GReader: Postgres support
Nah, same format. Imported fine. They are set starred, but it could be made optional.
Edit: I'll make it go by filename, if starred.json then set starred. Updated in trunk.
Edit2: made a separate thread as to not shit up this one.
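The filename rule is trivial; a sketch of the idea (the helper name is hypothetical, not the actual trunk code):

```python
import os

def is_starred_file(path):
    # Decide the starred flag by filename: entries coming from starred.json
    # get marked starred, entries from shared.json don't.
    return os.path.basename(path) == "starred.json"
```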
Re: Import shared/starred items from GReader: Postgres support
If it is easy, in addition to shared items, an import of labels would also be nice.
That said, this will mostly only be of use over the next three months..
-
- Bear Rating Trainee
- Posts: 4
- Joined: 29 Mar 2013, 13:01
- Location: Amsterdam, The Netherlands
- Contact:
Re: Import shared/starred items from GReader: Postgres support
I tried the import implemented by nic's gritttt-rss, but it failed.
tt-rss v1.7.5, php 5.4.13, postgresql 9.2.3; dedicated hosting and multi user tt-rss
Running import.py with python 2.7.3 (it doesn't seem to work with python 3) results in a gritttt-import.sql of 11MB (yes, long-time greader user here...) with 14792 inserts.
Then running "psql -U tt-rss -f gritttt-import.sql tt-rss &> log" the transaction is aborted because:
psql:gritttt-import.sql:63434: ERROR: index row size 4072 exceeds maximum 2712 for index "ttrss_entries_guid_key"
HINT: Values larger than 1/3 of a buffer page cannot be indexed.
Consider a function index of an MD5 hash of the value, or use full text indexing.
So my guess is that the index ttrss_entries_guid_key uses a btree and my starred/shared feeds include an entry with a very long GUID.
I managed to isolate the "offending" entry in the import-generated SQL and in shared.json. Here are the relevant parts (as far as I can tell):
* insert statement http://dpaste.org/ZwaPI/raw/
* original offending entry http://bpaste.net/show/87488/ (notice that it's invalid json)
* corrected offending entry http://bpaste.net/show/87490/
(had to use dpaste for the sql statement as bpaste decided that it contained spam.. whatever).
So the problem here might be that my Google Reader takeout is not valid JSON at all. Nice. I'll try to edit and validate this huge JSON file and start the import again...
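In the meantime, one workaround sketch for the index error, following the psql HINT: hash overlong GUIDs before generating the INSERTs so they fit in the btree index. This is just an idea, not part of gritttt-rss; the "md5:" prefix and the 2000-byte cutoff are my own choices:

```python
import hashlib

def safe_guid(guid, limit=2000):
    # Postgres btree index rows max out around 2712 bytes (the error above),
    # so replace overlong GUIDs with an MD5 digest that ttrss_entries_guid_key
    # can index; short GUIDs pass through unchanged.
    if len(guid.encode("utf-8")) <= limit:
        return guid
    return "md5:" + hashlib.md5(guid.encode("utf-8")).hexdigest()
```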
- fox
Re: Import shared/starred items from GReader: Postgres support
Edit: in an effort not to shit up this thread, posts related to the other plugin were moved to the plugin thread.
viewtopic.php?f=1&t=1573