I really don't understand why Laravel and October have both shipped such a misleading config/filesystems.php file.
Let me try to explain what I'm talking about:
Below is the current setup:
<?php

return [

    /*
    |--------------------------------------------------------------------------
    | Default Filesystem Disk
    |--------------------------------------------------------------------------
    |
    | Here you may specify the default filesystem disk that should be used
    | by the framework. A "local" driver, as well as a variety of cloud
    | based drivers are available for your choosing. Just store away!
    |
    | Supported: "local", "s3", "rackspace"
    |
    */

    'default' => 'local',

    /*
    |--------------------------------------------------------------------------
    | Default Cloud Filesystem Disk
    |--------------------------------------------------------------------------
    |
    | Many applications store files both locally and in the cloud. For this
    | reason, you may specify a default "cloud" driver here. This driver
    | will be bound as the Cloud disk implementation in the container.
    |
    */

    'cloud' => 's3',

    /*
    |--------------------------------------------------------------------------
    | Filesystem Disks
    |--------------------------------------------------------------------------
    |
    | Here you may configure as many filesystem "disks" as you wish, and you
    | may even configure multiple disks of the same driver. Defaults have
    | been setup for each driver as an example of the required options.
    |
    */

    'disks' => [

        'local' => [
            'driver' => 'local',
            'root' => storage_path('app'),
        ],

        's3' => [
            'driver' => 's3',
            'key' => 'your-key',
            'secret' => 'your-secret',
            'region' => 'your-region',
            'bucket' => 'your-bucket',
        ],

        'rackspace' => [
            'driver' => 'rackspace',
            'username' => 'your-username',
            'key' => 'your-key',
            'container' => 'your-container',
            'endpoint' => 'https://identity.api.rackspacecloud.com/v2.0/',
            'region' => 'IAD',
        ],

    ],

];
The above file is misleading, and the "local" label just doesn't make sense. Take the following part:
'local' => [
    'driver' => 'local',
    'root' => storage_path('app'),
],
So suddenly the "local" root becomes the storage root, which makes no sense at all, and it trips people up when they use the Storage:: file storage facade and try to point at the real root of their app.
Local root should be local root.
Storage root should be storage root.
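For example, with the shipped config above, a call against the "local" disk lands under storage/app rather than the project root. A minimal sketch of the behaviour being described (the file name is just illustrative):

use Illuminate\Support\Facades\Storage;

// With the stock config ('local' root => storage_path('app')):
Storage::disk('local')->put('robots.txt', "User-agent: *\n");

// The file is written to <project>/storage/app/robots.txt,
// not to the project root that the disk name suggests.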
Below is a far less misleading config file example:
<?php

return [

    /*
    |--------------------------------------------------------------------------
    | Default Filesystem Disk
    |--------------------------------------------------------------------------
    |
    | Here you may specify the default filesystem disk that should be used
    | by the framework. A "local" driver, as well as a variety of cloud
    | based drivers are available for your choosing. Just store away!
    |
    | Supported: "storage", "local", "s3", "rackspace"
    |
    */

    'default' => 'storage',

    /*
    |--------------------------------------------------------------------------
    | Default Cloud Filesystem Disk
    |--------------------------------------------------------------------------
    |
    | Many applications store files both locally and in the cloud. For this
    | reason, you may specify a default "cloud" driver here. This driver
    | will be bound as the Cloud disk implementation in the container.
    |
    */

    'cloud' => 's3',

    /*
    |--------------------------------------------------------------------------
    | Filesystem Disks
    |--------------------------------------------------------------------------
    |
    | Here you may configure as many filesystem "disks" as you wish, and you
    | may even configure multiple disks of the same driver. Defaults have
    | been setup for each driver as an example of the required options.
    |
    */

    'disks' => [

        'storage' => [
            'driver' => 'local',
            'root' => storage_path('app'),
        ],

        'local' => [
            'driver' => 'local',
            'root' => base_path(),
        ],

        's3' => [
            'driver' => 's3',
            'key' => 'your-key',
            'secret' => 'your-secret',
            'region' => 'your-region',
            'bucket' => 'your-bucket',
        ],

        'rackspace' => [
            'driver' => 'rackspace',
            'username' => 'your-username',
            'key' => 'your-key',
            'container' => 'your-container',
            'endpoint' => 'https://identity.api.rackspacecloud.com/v2.0/',
            'region' => 'IAD',
        ],

    ],

];
Notice that in the above code I have given "local" a correct root and made "storage" the new default (so as not to create a breaking change for people's plugins), while making the file far less misleading.
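To make the intent concrete, this is how the two disks would behave under the proposed config (a sketch only; the file names are illustrative):

use Illuminate\Support\Facades\Storage;

// 'storage' is now the default disk, so existing code keeps writing to storage/app
Storage::put('example.txt', 'contents');       // -> storage/app/example.txt

// 'local' now genuinely means the local project root
Storage::disk('local')->get('robots.txt');     // -> <project root>/robots.txt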
Laravel is an opinionated framework, and one of those opinions is that storage should live in a /storage/app directory if you're going to be storing files locally on the server. You say it's confusing that Storage::disk('local') refers to /projectPath/storage/app and not /projectPath; but I, and anyone who has ever used Laravel, would find it confusing if random files and folders started appearing in our project root whenever we used Storage::disk('local').
Even though Laravel is opinionated about how projects are set up, it's still configurable and extensible, so you're free to do whatever you want in your own projects. We're not going to be making this change in October, however, as I don't see a strong enough benefit to outweigh the massive cons of making it.
Note that you're still free to have the setup you described in your own October-based projects; we're just not going to be changing the default.
@LukeTowers
Back in the day Storage::get() was called File::get(), so it's just a name change that also tries to lock developers into using only the storage folder. However, developers still use the root for files such as robots.txt, .htaccess and sitemap.xml. I need to use that Laravel facade against the root to check on those files and make them dynamic. Any suggestions would be helpful.
@ayumi-cloud just define a new disk called root and access it with Storage::disk('root')->get('robots.txt');
or do it directly with file_get_contents(base_path('robots.txt')) etc.
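A minimal sketch of the first approach, assuming a disk named root that is not part of the stock config:

// config/filesystems.php - add a disk whose root is the project root
'disks' => [
    // ... existing disks ...
    'root' => [
        'driver' => 'local',
        'root' => base_path(),
    ],
],

Then in application code:

use Illuminate\Support\Facades\Storage;

$robots = Storage::disk('root')->get('robots.txt');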
@LukeTowers Thanks, will investigate both.
Will put the solution here (to help out people researching this in the future).
In the end I went for pure PHP.
Laravel | PHP
--- | ---
Storage::get() | file_get_contents()
Storage::put() | file_put_contents()
Storage::size() | filesize()
Storage::makeDirectory() | mkdir()
Storage::copy() | copy()
Storage::move() | rename()
Storage::exists() | file_exists()
Storage::delete() | unlink()
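For example, reading and updating robots.txt in the project root with plain PHP might look like this (the path and the rule being appended are just illustrative):

// Read the existing robots.txt from the project root
$path = base_path('robots.txt');
$robots = file_exists($path) ? file_get_contents($path) : '';

// Append a rule and write the file back
$robots .= "\nDisallow: /admin";
file_put_contents($path, $robots);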
@ayumi-cloud Maybe this makes your development simpler, but developers don't usually serve robots.txt and sitemap.xml from the root folder. They mostly use a separate API route (as the OCMS plugin does: https://github.com/rainlab/sitemap-plugin) or a separate CMS page (as I did for my custom CMS page: https://octobertricks.com/tricks/dynamic-robotstxt), or in Laravel the files simply live in the public folder.
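A minimal sketch of that kind of dynamic route in plain Laravel (the rules returned are placeholders, not what either plugin actually outputs):

use Illuminate\Support\Facades\Route;

// routes/web.php - serve robots.txt dynamically instead of from a file on disk
Route::get('robots.txt', function () {
    $rules = "User-agent: *\nDisallow: /admin\n";

    return response($rules, 200)->header('Content-Type', 'text/plain');
});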
I would also recommend a custom plugin route for those sorts of files; that's what I do for my Google Domain Verification plugin.
@Samuell1 Thanks for the link.
@LukeTowers Thanks, just looking at your plugin to compare it with @Samuell1's article.