Fixed
Status Update
Comments
ia...@gmail.com <ia...@gmail.com> #2
Now the limit appears to be hard; I get the error every time. I reach this limit with
just a stub app and these packages installed:
appengine_monkey-0.1dev_r5-py2.5.egg/
Beaker-0.7.5-py2.5.egg/
decorator-2.2.0-py2.5.egg/
easy-install.pth
FormEncode-0.7.1-py2.5.egg/
Jinja-1.2-py2.5-linux-i686.egg/
Mako-0.1.8-py2.5.egg/
nose-0.10.1-py2.5.egg/
Paste-1.6-py2.5.egg/
PasteDeploy-1.3.1-py2.5.egg/
PasteScript-1.6.2-py2.5.egg/
Pylons-0.9.6.1-py2.5.egg/
Routes-1.7.1-py2.5.egg/
setuptools-0.6c7-py2.5.egg/
setuptools-0.6c8-py2.5.egg/
setuptools.pth
simplejson-1.7.1-py2.5.egg/
TestApp.egg-link
WebHelpers-0.3.2-py2.5.egg/
While removing setuptools 0.6c7 gets me down to 975 files, that is clearly too low a
limit. (TurboGears 2, for instance, would never install.)
li...@gmail.com <li...@gmail.com> #3
[Comment deleted]
ch...@gmail.com <ch...@gmail.com> #4
I think it's hilarious that you Google guys are labeling all these "does not work"
bugs as features. Your site says Pylons works, but it didn't at launch, and the
"Pylons doesn't work" bug got labeled as a feature. People need to use libraries for
their apps, and many of these libraries have numerous small files. "Real-world apps
do not work on AppEngine" seems to me to be a bug, not a feature request.
ha...@gmail.com <ha...@gmail.com> #5
I think "Feature" means "Feature request," not wontfix.
gv...@gmail.com <gv...@gmail.com> #6
We encourage developers to find creative ways to reduce the number of files in an
app. For example, a zip file with many small files is much more efficient than many
small individual files. You can probably implement this yourself by using the
zipimport.py from the Python sandbox:
http://svn.python.org/view/sandbox/trunk/import_in_py/zipimport_/
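For illustration, here is a minimal sketch of that approach. The package and file names are hypothetical; in standard CPython the built-in zipimport already handles zip files placed on sys.path, whereas App Engine at the time needed a pure-Python loader such as the sandbox one linked above.
# build_zip.py (hypothetical helper) -- run locally before deploying:
# pack a pure-Python package directory into a zip archive.
import os
import zipfile
def zip_package(pkg_dir, zip_name):
    zf = zipfile.ZipFile(zip_name, 'w', zipfile.ZIP_DEFLATED)
    for root, dirs, files in os.walk(pkg_dir):
        for name in files:
            if not name.endswith('.pyc'):
                path = os.path.join(root, name)
                zf.write(path, path)  # keep the package-relative path inside the zip
    zf.close()
zip_package('simplejson', 'simplejson.zip')
# main.py -- at the top of the request handler script
import sys
sys.path.insert(0, 'simplejson.zip')  # the zip behaves like a directory on the path
import simplejson  # now resolved from inside the archive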
bj...@gmail.com <bj...@gmail.com> #7
"We encourage developers to find creative ways to reduce the number of files in an
app."
It's just so much easier to find another place to host our apps than it is to get
around the crazy restrictions all over Google app engine.
app."
It's just so much easier to find another place to host our apps than it is to get
around the crazy restrictions all over Google app engine.
de...@gmail.com <de...@gmail.com> #9
I just downloaded google's code review GAE app (Rietveld) and it was 1014 files (14
over the limit). See, it's hard to stay under 1000 files if you need to load
libraries.
Only about 40 files make up the actual app source code -- the rest is a copy of
django
ro...@gmail.com <ro...@gmail.com> #10
If GAE did proper setuptools integration, perhaps we could import eggs as zips
instead of directories? This would cut down on the number of files by a lot when using
other libs.
ka...@gmail.com <ka...@gmail.com> #11
I was trying to get Zope 3 running well on App Engine, and ran into the file limit issue. I tried following Guido's suggestion on this issue, namely to use the import-in-py code to simulate zipimport. Unfortunately importlib relies on marshal.dumps, which is not included on App Engine, so it's apparently not possible to use this as a workaround to the file limit.
I blogged some details of the attempt for those interested; it does work OK on the dev server. The code is attached.
http://blog.kapilt.com/2008/05/30/zipped-packages-on-app-engine/
Hopefully another solution to this issue will surface.
kc...@gmail.com <kc...@gmail.com> #12
We want Pylons support. The file limit is making this unnecessarily difficult.
ka...@gmail.com <ka...@gmail.com> #13
Re zipimport: I ended up commenting out the marshal.dumps part in importlib, as I realized it was extraneous (it writes bytecode). On the app server, the zipimport now gets a little farther, in that it actually imports a single module (with no inter-module dependencies) from a zip package. On the dev server, inter-module dependencies are automatically compiled and loaded by the same call to the compile builtin.
Can we please have a bump to the 1000 file limit, or a viable workaround?
bi...@gmail.com <bi...@gmail.com> #14
Number of *blobs* is not limited to 1000. That was a typo in the AppEngine docs and
has been fixed.
do...@gmail.com <do...@gmail.com> #15
It seems lame that marshal.dumps isn't allowed. Is it really that dangerous? I
understand not having pickle, but marshal?
jo...@gmail.com <jo...@gmail.com> #16
[Comment deleted]
jo...@gmail.com <jo...@gmail.com> #17
Is there an explanation of the rationale behind the 1000 file limit? I don't quite
understand it.
ma...@gmail.com <ma...@gmail.com> #18
This is so frickin' insanely bastard annoying. Starring.
gv...@gmail.com <gv...@gmail.com> #19
I'm attaching a file my_zipimport.py which provides a zip importer that I have tested
successfully with Rietveld and Django in the SDK as well as in production. The file
is open source, using the Apache 2.0 license.
I make no promises that the API will remain the same, but I expect that this will
show up in a future SDK. (Alas, it's too late for the next SDK release, which is
already in the middle of QA.)
Source code modifications:
import my_zipimport
my_zipimport.install()
sys.path.insert(0, 'django.zip')
CAVEAT: when using dev_appserver, somehow the default loader is tried first, and it
will find the copy of django 0.96 in the SDK even though the zip file is first on the
path. I'll have to figure out why that is; my work-around so far has been to remove,
rename or otherwise disable the <SDK>/lib/django/ directory.
To make a django.zip that fits in under 1 MB, I did the following in my Linux shell:
zip -q django.zip `find django -name .svn -prune -o -type f ! -name '*.pyc' ! -name '*.[pm]o' -print`
The find command skips .svn directories and .pyc files, and also .po and .mo files, which have something to do with internationalization and can apparently be omitted.
The resulting django.zip is under 0.8 MB.
I'm sure there are other things you could remove (e.g. the Django admin app, most db
backends, etc.) but that's a separate project.
da...@gmail.com <da...@gmail.com> #20
Is there any plan to remove the limit?
sl...@gmail.com <sl...@gmail.com> #21
They seem to be storing the uploaded files in BigTable, based on bits I've heard in
the developers' interviews. If so they may be running up against the 1000-record
limit on query results. If true, this would give a nice logical explanation for the
problem, but unfortunately suggests it would be difficult to raise.
The problem with zipped packages is that some packages use ``__file__`` extensively.
While Setuptools discourages this because it prevents zipping, zipped packages have
never been a requirement anywhere. So if App Engine indirectly requires large
packages to be zipped, it places a significant burden on developers of existing
packages, which should at least be acknowledged.
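To make the __file__ concern concrete, a small hypothetical example (the module and file names are made up): the first function works when the package sits in a normal directory but breaks under zipimport, because open() cannot reach a path that points inside an archive. Loader-aware access such as pkgutil.get_data keeps working, though that function only arrived in Python 2.6, so it would not have helped on the 2.5 runtime discussed here.
import os
import pkgutil
def load_template_from_disk():
    # Breaks once the package is zipped: __file__ becomes '.../mylib.zip/mylib/web.py'
    # and open() cannot descend into the archive.
    path = os.path.join(os.path.dirname(__file__), 'default.html')
    return open(path).read()
def load_template_zip_safe():
    # Asks the module's loader for the bytes, which works for directories and zips.
    # 'mylib' and 'default.html' are made-up names for this sketch.
    return pkgutil.get_data('mylib', 'default.html')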
ma...@gmail.com <ma...@gmail.com> #22
@21
I use a ton of packages in apps that get zipped to be bundled with py2exe, and I have never seen a package misbehave because of that. So I think you overestimate how much of an issue it is.
ch...@gmail.com <ch...@gmail.com> #23
@22 py2exe-produced packages can be temporarily unzipped, I think. App Engine packages
can't.
ma...@gmail.com <ma...@gmail.com> #24
@23 yes they could but that never happens AFAICT. If you choose to zip the .dll /
.pyd files as well those would need to be unzipped, but that doesn't apply to
appengine anyway.
da...@gmail.com <da...@gmail.com> #25
Well, what if the custom Python interpreter somehow implemented the zipimport module seamlessly? That would be much easier than making us zip up and package all extra modules every release.
py...@gmail.com <py...@gmail.com> #26
Pylons support please.
gv...@gmail.com <gv...@gmail.com> #27
Here's a new version, named py_zipimport. The instructions are different:
import sys
import py_zipimport
sys.path.insert(0, 'django.zip') # or whatever
Please give it a try.
gv...@gmail.com <gv...@gmail.com> #28
PS. That still doesn't work in the SDK. Changes to the SDK are necessary to support
zipimport. Those will come in a future SDK version (the next SDK version is already
too far along in QA to include it).
sl...@gmail.com <sl...@gmail.com> #29
Would it be possible to post a patch for the SDK in the meantime?
gv...@gmail.com <gv...@gmail.com> #30
Sorry, no, I don't want to bypass our SDK QA process.
You don't need the zip import when using the SDK anyway -- it doesn't enforce the
limit on number of files.
gc...@gmail.com <gc...@gmail.com> #31
So if I understand that right, I can't run the same application on the SDK and on App Engine if I want to use py_zipimport?
gv...@gmail.com <gv...@gmail.com> #32
@gcarothers: All you need to do is unzip the zipfile in your project when using the
SDK. And this is a temporary situation.
br...@gmail.com <br...@gmail.com> #33
My application needs more than 1000 files.
I tried the bulkload gag initially, but it was unusably slow. I think it would have taken over 8 hours to load my initial 72 MB of data.
So... I planned to upload XML files and search them with BOSS (hopefully) instead...
Now I need to find a workaround to my workaround's workaround.
I can understand a file size quota, but a file count quota?
gv...@gmail.com <gv...@gmail.com> #34
FWIW, zipimport support has been rolled out to production. SDK support will come
with the 1.1.3 SDK (really soon now).
hu...@gmail.com <hu...@gmail.com> #35
That is great.
I would also appreciate some instructions on this, when it is available.
gv...@gmail.com <gv...@gmail.com> #36
An article about how to use this is forthcoming. Please stand by.
ka...@gmail.com <ka...@gmail.com> #37
Does the file limit exist because of app versioning (e.g. 50 application versions stored = 50,000 files)?
If so, would it make sense to support unversioned files (icons, test files, docs, junk, libraries) via the YAML config?
gv...@gmail.com <gv...@gmail.com> #38
No, it exists because of the sheer number of apps. E.g. 1,000,000 apps ==
1,000,000,000 files.
no...@gmail.com <no...@gmail.com> #39
Would it be silly to suggest that code just gets zipped automatically when placed on the App Engine architecture? This would certainly make it easier for developers, although possibly at a cost to Google's complexity.
gv...@gmail.com <gv...@gmail.com> #40
No, the developer has to request zipping. Lots of code doesn't run correctly from
zip files, e.g. anything opening data files relative to __file__.
no...@gmail.com <no...@gmail.com> #41
One more idea. I wonder how many of the files developers upload have the exact same checksum? It seems like this could present one way to limit the total files stored: the upload mechanism could keep a global checksum database, and each time a file had already been uploaded to the Google infrastructure it could just symlink it instead of creating a new one.
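A minimal sketch of that idea, purely for illustration -- this is not how App Engine's upload pipeline actually works, and the paths and layout are made up:
import hashlib
import os
def store_deduplicated(src_path, blob_root, link_path):
    # Illustration only: content-address the file so identical uploads share one blob.
    data = open(src_path, 'rb').read()
    digest = hashlib.sha1(data).hexdigest()
    blob_path = os.path.join(blob_root, digest)
    if not os.path.exists(blob_path):  # first time this content is seen
        out = open(blob_path, 'wb')
        out.write(data)
        out.close()
    if os.path.lexists(link_path):
        os.remove(link_path)
    os.symlink(blob_path, link_path)  # the app's copy is just a link to the shared blob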
[Deleted User] <[Deleted User]> #42
I believe that App Engine is already doing this -- at least within user accounts.
I created a staging app with the same code as my main app and no files were uploaded on initial update.
I don't know whether it reuses files across all applications, but I wouldn't be surprised if it did. However, even
when reused, those files count against your limit.
no...@gmail.com <no...@gmail.com> #43
Another obvious alternative to dealing with this file limit problem could be to do what Google is doing with jQuery hosting. If they don't want to support every web framework under the sun, which is really the issue here in some ways, they could have a common or third-party area.
You could probably satisfy 90% of people's file limit problems, I would guess, by just having a "semi-unofficial" framework depot that developers could use. To prevent security issues, the uploaded files could be managed by the framework developers or, at the least, could use checksums to compare the uploaded file against the file in the framework developer's public repository.
sl...@gmail.com <sl...@gmail.com> #44
1000-file limit. Of course there would be a problem of versioning, especially with
frameworks that are changing rapidly. But if App Engine could provide directories we
could add to our Python path to enable a certain framework version, that would solve
that.
gv...@gmail.com <gv...@gmail.com> #45
I have floated this idea internally before. It may happen, but there are other
priorities.
be...@gmail.com <be...@gmail.com> #46
@36 Guido, now that 1.1.3 is out, is there any documentation on including django.zip
and this zipserve I've been hearing about?
wh...@gmail.com <wh...@gmail.com> #47
It would be pretty awesome if GAE had a shared PyPI mirror. Then developers could just
add /pypi/somelibrary-someversion/ onto their path. Aside from likely being quite a
lot of work to set something like that up, maybe there are some reasons why this is a
silly idea?
There would be issues such as dealing with libraries that have C extensions (maybe if
PyPI required a GAE-safe-flag or something like that), and dealing with the
possibility of someone uploading something nasty onto PyPI in an attempt to inject
nasty code into apps.
de...@gmail.com <de...@gmail.com> #48
I just tried this out with CherryPy and Mako on SDK version 1.1.5 and it seems to work just fine. You don't need Guido's py_zipimport.py file from comment #27.
Example code below:
import sys
sys.path.insert(0, 'cherrypy.zip')
sys.path.insert(0, 'mako.zip')
import cherrypy
from cherrypy import tools
from mako.template import Template
from mako.lookup import TemplateLookup
import wsgiref.handlers
... the rest of your code...
Worth repeating: I have not tested this thoroughly. I do note that CherryPy 3.1 does sometimes load using __file__, mostly in the tests and tutorials, but also a few times in the plugins file. Hopefully the tools are not affected. I'll update once a more thorough test is done. Mako should be fine as-is.
gv...@gmail.com <gv...@gmail.com> #49
Yes, delagoya is right. Ever since 1.1.3 you don't need the py_zipimport.py any
more; this is all built in now. I'd remove it from this issue except I don't seem to
have that power.
no...@gmail.com <no...@gmail.com> #50
Google, thanks for working to fix the web framework issues like this, despite some heavy whining from people like myself. I appreciate it!
gv...@gmail.com <gv...@gmail.com> #51
Would people have a problem if I *removed* the two comments with obsolete copies of
my [py_]zipimport.py, to save future readers wasted time if they download one of
these without reading the whole thread?
bl...@gmail.com <bl...@gmail.com> #52
Some editing would probably be a good idea.
The best option might be to write an article for
http://code.google.com/appengine/articles/ about how to use zipped eggs on AppEngine,
then add links to your old comments.
gv...@gmail.com <gv...@gmail.com> #53
Unfortunately I cannot do any editing -- I can either delete comments 19 and 27
completely, or leave them unchanged.
An article is already there:
http://code.google.com/appengine/articles/django10_zipimport.html
Jo...@hotmail.com <Jo...@hotmail.com> #54
The subject of comment 52 sounds like the right thing to do to me.
de...@gmail.com <de...@gmail.com> #55
Well, good thing that article is so thorough; it saves me the trouble of writing one up on my new GAE-focused blog (import shameless.plug ;) http://appmecha.wordpress.com ).
One note: the fact that the article is Django-focused may hide the fact that this is a general solution for importing any libraries that an application may need (provided you test that it does work with your library).
I created a recipe in the cookbook that is basically my test above and posted a link
back to the Django article.
http://appengine-cookbook.appspot.com/recipe/respect-the-file-quota-with-zipimport
ia...@gmail.com <ia...@gmail.com> #56
When using a large JavaScript library, all the static files also count against the quota. For instance, Xinha (a WYSIWYG library) has 851 files in its core, plugins, and modules (excluding docs, examples, etc). Obviously this leaves far too few files for the application. Some of this can be saved with skip_files (though simply getting down to 851 files requires skip_files) and by combining JavaScript, but there are also valid reasons to want those files separate, and things like icons can't be combined.
Supporting zipimport helps, but the limit remains very low for anyone using existing libraries, and of course zipimport is not applicable to JavaScript libraries.
no...@gmail.com <no...@gmail.com> #57
It seems like the "Google way" would be to use this:
http://code.google.com/apis/ajaxlibs/documentation/#googleDotLoad
But, this does create a bottleneck if someone wants to use something that isn't currently supported. It might be
nice to have a "Google Video" type system where users could upload their own libraries, and then the space
would still be conserved as something on the back end could do a checksum to detect if these files had already
been loaded.
gv...@gmail.com <gv...@gmail.com> #58
Again, how about zipserve?
Jo...@hotmail.com <Jo...@hotmail.com> #59
I'd guess what is meant by zipserve is
http://code.google.com/p/googleappengine/source/browse/trunk/google/appengine/ext/zipserve/__init__.py
That could do the trick, but I can find no reference to it in the docs, which would
likely explain why most of us don't seem to know about it.
If zipimport and zipserve are going to be the official workaround to the 1000 file
limit and that limit is not going to be removed it would be helpful if the visibility
of both zipimport and zipserve were increased. If the samples/tutorials included
their use then folks would be using them from the get go and there would be no need
to refactor later when one runs into the 1000 file limit.
gv...@gmail.com <gv...@gmail.com> #60
We will document zipserve -- it was an oversight that it wasn't documented before.
For now, the docstrings from
http://code.google.com/p/googleappengine/source/browse/trunk/google/appengine/ext/zipserve/__init__.py
should serve as plenty of documentation to get you started!
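For readers who don't want to dig through that file yet, here is a rough sketch of the general technique of serving static files out of a zip with the webapp framework. It illustrates the idea only and is not the actual zipserve API; the URL route and archive name are made up, and a real handler would cache the open archive and set caching headers.
import mimetypes
import zipfile
from google.appengine.ext import webapp
from google.appengine.ext.webapp.util import run_wsgi_app
class ZipStaticHandler(webapp.RequestHandler):
    def get(self, name):
        # 'static.zip' and the /static/ route are made-up names, not the real zipserve API.
        # Look the requested file up inside the archive instead of on disk.
        archive = zipfile.ZipFile('static.zip', 'r')
        try:
            data = archive.read(name)
        except KeyError:
            self.error(404)
            return
        content_type = mimetypes.guess_type(name)[0] or 'application/octet-stream'
        self.response.headers['Content-Type'] = content_type
        self.response.out.write(data)
application = webapp.WSGIApplication([('/static/(.*)', ZipStaticHandler)])
def main():
    run_wsgi_app(application)
if __name__ == '__main__':
    main()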
Jo...@hotmail.com <Jo...@hotmail.com> #61
Just saw this thread on the App Engine group:
http://groups.google.com/group/google-appengine/browse_thread/thread/400c37cc773b9f46
The poster indicates that using zipimport puts a floor of about 500ms on request processing.
gv...@gmail.com <gv...@gmail.com> #62
Odd. It should only happen on the first request (for a specific process). I can
confirm that this is not a problem for Rietveld, for example -- so there must be
something in that app that keeps triggering the unzip.
je...@gmail.com <je...@gmail.com> #63
Could someone confirm whether or not this 1000 blob limit actually exists? Comment #14 claims this was only a typo in the documentation. If so, this "issue" should be deleted, no?
jo...@google.com <jo...@google.com> #64
The limit is 1000 code files and 1000 static files. Each file may be up to 10MB.
Also, the total size of all code files must remain under 150MB (there is no analogous
limit on static files).
Also, to briefly summarize, at launch we had very tight limits and no workarounds
(i.e. the limits were 1MB per file). Then, in July-September we rolled out zipimport
and zipserve, which provide very nice workarounds by zipping small code or static
files. Then, last week, we raised the per-file size limit from 1MB to 10MB.
At this point it would be worthwhile to reflect on this issue and determine what is
the most salient pain point to attack next. Do some people still need the number of
code or static files to increase? What new limit would make your app work? Is there
something else that you would really like instead to get your app to work?
pr...@gmail.com <pr...@gmail.com> #65
Having to zip up stuff can still be problematic for libraries that do not support it. Libraries big enough to need zipping in the first place generally use a large number of asset files, not strictly code files.
Either removing the code file limit or upping it to around 10,000 would alleviate most any issue, is my feeling. Much like the 1 MB size limit was a bit on the short side, so is the 1,000 code file limit. It doesn't take more than a couple of decently sized libraries to break it.
le...@gmail.com <le...@gmail.com> #66
Well, considering so many projects seem to have to take special code paths just for App Engine, making them also try to squeeze their frameworks down below 1000 files is asking a lot, in my opinion.
Alleviating this issue would be meeting them halfway on a lot of things, making App Engine that much less of a hassle to get into, and more worthwhile.
sl...@gmail.com <sl...@gmail.com> #67
This bug/feature affects the Java implementation as well, so when using SmartGWT I had to create a Java version of the zipserve utility.
I'd rather prefer to have the file limit raised, though (SmartGWT has about 1500 static files).
ki...@gmail.com <ki...@gmail.com> #68
users/clients in these countries are not able to access your GAE services with your
own domain name.
See
ew...@gmail.com <ew...@gmail.com> #69
The limit of 1000 files was really a very bad idea. When does it get changed? How do I get notified?
ar...@gmail.com <ar...@gmail.com> #70
I can see this is still an issue, as it was posted more than a year ago. This is affecting my program too. I am using Google App Engine for Java, and it is a shame that we can't upload past that limit. Also, the solution using ZipPackages doesn't work for Java, or at least it didn't work when I tried it:
http://code.google.com/p/app-engine-patch/wiki/ZipPackages . Either fix this issue by providing some workaround or just let us upload WAR files directly.
de...@gmail.com <de...@gmail.com> #71
Such a limit on the number of files is obsolete nowadays.
It creates extra complexity and reduces library code reuse.
ar...@gmail.com <ar...@gmail.com> #72
I found out the source of the problem. It is not the libs but the number of images and other files the application is trying to upload. If it were saved as a WAR file this problem would have been bypassed. Does anybody know of a workaround or fix?
ye...@gmail.com <ye...@gmail.com> #73
Hi,
This limitation seems a bit unfair to Java and especially GWT. When compiled, each
class, including top-level, nested, local and anonymous, gets its own .class file,
which ends up in the WEB-INF/classes folder. I only have 433 Java files in my source
tree, but they become 830 class files in WEB-INF/classes. This is not so much the
fault of the server-side code. It is the GWT code, which, like many component-oriented GUI frameworks, relies a lot on anonymous classes used as event handlers and such.
I think WEB-INF/classes directory should count as one file. Google Eclipse plugin
could simply jar it up and send it over to app-engine as a single jar file. That
would solve the problem.
I have submitted a specific request regarding the WEB-INF/classes folder as a
separate issue. Feel free to comment and star it:
http://code.google.com/p/googleappengine/issues/detail?id=1579
Thanks,
Yegor
ne...@gmail.com <ne...@gmail.com> #74
This is indeed a major issue for GWT apps.
What is the workaround, please?
ye...@gmail.com <ye...@gmail.com> #75
Hi, nexource,
The workaround is to use a build script and command-line tools instead of the Eclipse
plugin. Instruct your build script to jar all files from WEB-INF/classes and put the
jar file into WEB-INF/lib, then delete WEB-INF/classes. Here's an Ant script snippet
for you:
<jar destfile="${dist.dir}/WEB-INF/lib/yourappname.jar">
<fileset dir="${dist.dir}/WEB-INF/classes">
<include name="**/*" />
</fileset>
</jar>
<delete dir="${dist.dir}/WEB-INF/classes" />
My ${dist.dir} is a copy of the "war" directory so the script leaves the original
files untouched, otherwise you may run into complications with Eclipse. Eclipse does
not play well with file-system changes made by external tools, such as Ant.
You will find a good tutorial here:
http://code.google.com/appengine/docs/java/tools/ant.html
Hope this helps.
Yegor
ne...@gmail.com <ne...@gmail.com> #76
Hi Yegor,
Thanks for the reply!
What about when you have a lot of resources as GWT-compiled files or small images?
Cheers
Rudi
ye...@gmail.com <ye...@gmail.com> #77
nexource,
Usually the GWT compiler does not produce a lot of files (unless you are localizing for
all countries and dialects in the world). It's the Java compiler that will produce a
lot of files. The latter is solved by jarring everything in WEB-INF/classes into a
single archive. As for lots of small images, ImageBundle should help you solve it,
explanation here:
http://tinyurl.com/oqayat
Yegor
ab...@gmail.com <ab...@gmail.com> #78
Hi,
I am using the Tatami package in my application, which provides wrappers for Dojo.
The JS files of Dojo are created in the "projectroot/war/projectname/dojo" folder during compilation and number more than 1000. While uploading with the Eclipse plugin it says:
java.io.IOException: Error posting to URL:
http://appengine.google.com/api/appversion/addblob?path=__static__%2Fboxadder%2Fdijit%2FEditor.js&app_id=b-tracker&version=1&
400 Bad Request
Max number of files and blobs is 1000
I zipped the dojo folder and kept the zip in /projectroot/war/projectname/ (the location of the dojo directory), then removed the dojo folder, as described at
http://code.google.com/p/app-engine-patch/wiki/ZipPackages . After this the application gets uploaded to App Engine, since the number of files is less than 1000 now, but then my application stops working as it is unable to find the required JS files.
Does the ZipPackages patch work for Java projects as well? Is there any workaround to solve this problem?
na...@gmail.com <na...@gmail.com> #79
For anyone using popular JS frameworks, Google does host them on their ajaxapis server. You can use google.load if you want (apparently it can do some geolocation optimization), but you can also access them directly using the URLs described here:
http://code.google.com/apis/ajaxlibs/documentation/#AjaxLibraries
That way you don't have to host these in your app (and they come from a pretty fast CDN too!) The only issue you might run into is that for frameworks like Dojo and others, behaviour can be an issue with cross-domain JavaScript.
Another thing is to configure the "static-files" directive in the appengine-web deployment descriptors. Most people don't know this, but typically files are double counted as static and resource files (especially bad when you have a lot of JavaScript/CSS/HTML resources). Typically that saves a ton when deploying your app and running into the file limits.
Going ahead, there are various things that I think should be done. I'm not completely against the limits (it is a shared-resource architecture, so you can't kill them for trying to limit stuff for performance reasons). However, I think that all types of files should be handled separately and have their own sets of limits. Since they are handled separately when serving, I think this would work out a bit better. Also, most people exceed these limits because they are dealing with extra static resources and framework files for the most part. Google can already deal with JavaScript frameworks using the Ajax Libraries API; maybe they can come up with a solution for others that use frameworks like GWT and Django so they don't have to host those along with their app.
iq...@gmail.com <iq...@gmail.com> #80
Please solve it.
gv...@gmail.com <gv...@gmail.com> #81
I expect this limit will increase to 3000 soon.
ma...@gmail.com <ma...@gmail.com> #82
@gvanrossum: we all hope very soon!
a....@gmail.com <a....@gmail.com> #83
...I just uploaded 1350 files... Anyone else notice this?
...Mind you, that's just what Eclipse tells me it has cloned; I set up some filtering
to tell the plugin to NOT include .client. packages, AND that
pre-upload-compile-to-jar-with-ant trick {still 1300 AFTER compiling my more static
modules}.
Perhaps someone else can try a few thousand files to see if we can finally loosen our
belts and stop worrying about how many small files and inline classes we use...
wk...@gmail.com <wk...@gmail.com> #84
Cool, this is what I got with app-engine-patch when uploading lots of files:
Max number of files and blobs is 3000.
gv...@gmail.com <gv...@gmail.com> #85
All, the combined limit on static and code files has indeed increased to 3000. There
is no plan to increase it further. The following limits are also still in place:
150 MB max combined size of code files
10 MB max individual size of any file
1000 files max per directory (not counting files in subdirectories)
In the quoted message, "blob" refers to static files; "file" refers to code files.
ki...@google.com <ki...@google.com> #86
Adding descriptive text to this in the hope that Google search picks it up: What is the maximum number of static files that you can upload to a Google App Engine application? As of Jan 2010, this is 3000 -- see gvanrossum's breakdown on this (comment 86).
ga...@gmail.com <ga...@gmail.com> #87
Taking http://code.google.com/p/googleappengine/issues/detail?id=161#c68 as the base, I have been able to put up SmartGWT on GAE at
http://mastergaurav.appspot.com
Thanks slindhom!
I have made one minor change in the servlet -- look for "If-Modified-Since" header and return a 304.
According to W3C, at http://www.w3.org/Protocols/HTTP/HTRQ_Headers.html#if-modified-since
- if the requested document has not changed since the time specified in this field, the document will not be sent; instead, a Not Modified 304 reply is returned.
The servlet does not send any content in case of a 304-response.
-Gaurav
http://www.mastergaurav.com
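For the Python side, the same conditional-GET idea as a rough sketch (header parsing simplified; LAST_MODIFIED is a hypothetical stand-in for the real modification time of the zipped assets, e.g. the deployment time -- the change described above was made in a Java servlet, not this code):
import email.utils
import time
from google.appengine.ext import webapp
LAST_MODIFIED = time.time()  # hypothetical: treat deployment time as last-modified
class ConditionalHandler(webapp.RequestHandler):
    def get(self, name):
        ims = self.request.headers.get('If-Modified-Since')
        if ims:
            parsed = email.utils.parsedate_tz(ims)
            if parsed is not None and LAST_MODIFIED <= email.utils.mktime_tz(parsed):
                # The client's copy is still fresh: reply 304 and send no body.
                self.response.set_status(304)
                return
        self.response.headers['Last-Modified'] = email.utils.formatdate(
            LAST_MODIFIED, usegmt=True)
        self.response.out.write('... serve the file contents here ...')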
ga...@gmail.com <ga...@gmail.com> #88
se...@gmail.com <se...@gmail.com> #90
Please allow reading static files larger than 1 MB. Read access alone would be enough, even without write access; it would spare us having to write a utility to read from the Blobstore.
wl...@gmail.com <wl...@gmail.com> #91
It would be nice if appcfg would just abort if I'm uploading more than 3000 files, instead of processing for half an hour and then giving me the over-the-limit error.
je...@gmail.com <je...@gmail.com> #92
Was this limit lowered back to 1000? I have 1014 files and I keep getting an error stating "backend null" (nice error descriptions!)
I think I'll be ditching App Engine. It was nice & easy deploying right from Eclipse, but this file limitation just leaves me with no choice.
sc...@google.com <sc...@google.com> #93
I don't think the error message you are getting is related to the number of files you have.
bo...@gmail.com <bo...@gmail.com> #94
Suppose I am working on an application where a massive number of pictures will be coming into my app and need to be stored for further use. Does this mean that there won't be a limit on how many images can be stored in the images directory, or does it mean that once my limit of 3000 is reached, no more images will be uploaded?
And if that's the case, what's the alternative -- use Google Cloud Storage?
mc...@gmail.com <mc...@gmail.com> #95
I was trying to deploy my web app today on Google App Engine using Eclipse.
I got an error saying:
"Max number of files and blobs is 10000.
See the deployment console for more details"
Really?
My web app contains 826 files and 106 folders.
What are you Googlers on about with your "Max number of files and blobs is 10000"? I think it is time to ditch Google App Engine :)
[Deleted User] <[Deleted User]> #96
Hi,
Did you find any solution #96?
I'm in the same situation.
ch...@gmail.com <ch...@gmail.com> #97
I was able to reduce the number of files by editing the app.yaml file to include the supported libraries, rather than uploading those files myself (such as Jinja2 and MarkupSafe). There is a list of supported libraries on App Engine's site:
https://developers.google.com/appengine/docs/python/tools/libraries27
I plan to move static files to another server such as AWS S3. I'm still looking for more "creative" ways to reduce the number of files.
Example of the edits on app.yaml file:
libraries:
- name: PIL
version: latest
- name: webob
version: latest
- name: webapp2
version: "2.5.2"
- name: jinja2
version: latest
- name: markupsafe
version: latest
er...@gmail.com <er...@gmail.com> #98
I am using PHP on GAE and this 10,000 file limit is not workable for most PHP apps. It is normal for a PHP app to have a lot of files and to exceed the 10,000 limit; not even standard software like WordPress and Drupal can be uploaded. Please raise this limit again.
wi...@gmail.com <wi...@gmail.com> #99
appcfg.py: error: Error parsing C:\newpro\app.yaml: Found more than 100 URLMap entries in application configuration
in "C:\newpro\app.yaml"
I have faced this error. Please give any suggestion.
in "C:\newpro\app.yaml"
i have faced this error. please give any suggesion
de...@tyo.com.au <de...@tyo.com.au> #100
[Comment deleted]
be...@rogmansmedia.nl <be...@rogmansmedia.nl> #101
Trying to run a PHP app on GAE: cannot deploy because I have too many files.
I removed the *Google API Client* from composer, good for *4500 files*.
So come on Google, raise that limit today.
Description
2008-04-11 22:13:46,924 ERROR __init__.py:1294 An unexpected error
occurred. Aborting.
Rolling back the update.
Error 400: --- begin server output ---
Max number of files and blobs is 1000.
It doesn't happen every time, though, only about every other time.
Currently my application just reached 1001 files. This is about a dozen
libraries, so if there really is some limit at 1000 files it really needs
to be raised.